Search results for: auditory processing delays
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4031

2831 Development of an EEG-Based Real-Time Emotion Recognition System on Edge AI

Authors: James Rigor Camacho, Wansu Lim

Abstract:

Over the last few years, the development of new wearable and processing technologies has accelerated in order to harness physiological data such as electroencephalograms (EEGs) for EEG-based applications. EEG has been demonstrated to be a source of emotion recognition signals with the highest classification accuracy among physiological signals. However, when emotion recognition systems are used for real-time classification, the training unit is frequently left to run offline or in the cloud rather than working locally on the edge. That strategy has hampered research, and the full potential of using an edge AI device has yet to be realized. Edge AI devices are high-performance computers that can process complex algorithms; they can collect, process, and store data on their own, and they can run complicated algorithms such as localization, detection, and recognition in real-time applications, making them powerful embedded devices. The NVIDIA Jetson series, specifically the Jetson Nano device, was used in the implementation. The cEEGrid, which is integrated with the open-source brain-computer interface platform (OpenBCI), is used to collect EEG signals. An EEG-based real-time emotion recognition system on edge AI is proposed in this paper. To perform graphical spectrogram categorization of EEG signals and to predict emotional states based on input data properties, machine learning-based classifiers were used. The EEG signals are analyzed with the K-Nearest Neighbors (KNN) technique, a supervised learning method, until the emotional state is identified. In EEG signal processing, the Fast Fourier Transform (FFT) is applied to each EEG signal as it is received in real time, translating it from the time to the frequency domain so that its frequency bands can be observed. To appropriately capture the variance of each EEG frequency band, the power density, standard deviation, and mean are calculated and employed as features. The next stage is to use these selected features to predict emotion in the EEG data with the KNN technique. Arousal and valence datasets are used to train the parameters defined by the KNN technique. Because classification and recognition of specific classes, as well as emotion prediction, are conducted both online and locally on the edge, the KNN technique increased the performance of the emotion recognition system on the NVIDIA Jetson Nano. Finally, this implementation aims to bridge the research gap on cost-effective and efficient real-time emotion recognition using a resource-constrained hardware device such as the NVIDIA Jetson Nano. On the cutting edge of AI, EEG-based emotion identification can be employed in applications that can rapidly expand its use in research and industry.
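
A minimal sketch of the kind of pipeline described above (FFT-derived band power, mean and standard deviation features feeding a KNN classifier) follows; the sampling rate, band edges, epoch length and labels are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch (not the authors' code): FFT band-power features from an
# EEG epoch followed by KNN classification of arousal/valence labels.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

FS = 250  # assumed sampling rate of the cEEGrid/OpenBCI stream, in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_features(epoch):
    """Return mean, std and total power per frequency band for one EEG epoch."""
    spectrum = np.abs(np.fft.rfft(epoch)) ** 2          # power spectrum via FFT
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / FS)
    feats = []
    for lo, hi in BANDS.values():
        band = spectrum[(freqs >= lo) & (freqs < hi)]
        feats += [band.mean(), band.std(), band.sum()]  # mean, std, power density
    return np.array(feats)

# Toy training data standing in for labelled arousal/valence epochs
rng = np.random.default_rng(0)
X_train = np.stack([band_features(rng.standard_normal(FS * 2)) for _ in range(40)])
y_train = rng.integers(0, 2, size=40)                   # e.g. low/high arousal

knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print(knn.predict([band_features(rng.standard_normal(FS * 2))]))
```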

Keywords: edge AI device, EEG, emotion recognition system, supervised learning algorithm, sensors

Procedia PDF Downloads 88
2830 Deep Learning-Based Approach to Automatic Abstractive Summarization of Patent Documents

Authors: Sakshi V. Tantak, Vishap K. Malik, Neelanjney Pilarisetty

Abstract:

A patent is an exclusive right granted for an invention. It can be a product or a process that provides an innovative method of doing something, or offers a new technical perspective or solution to a problem. A patent can be obtained by making the technical information and details about the invention publicly available. The patent owner has exclusive rights to prevent or stop anyone from using the patented invention for commercial purposes. Any commercial usage, distribution, import or export of a patented invention or product requires the patent owner’s consent. It has been observed that the central and important parts of patents are scripted in idiosyncratic and complex linguistic structures that can be difficult to read, comprehend or interpret for the masses. The abstracts of these patents tend to obfuscate the precise nature of the patent instead of clarifying it via direct and simple linguistic constructs. This makes it necessary to have efficient access to this knowledge via concise and transparent summaries. However, as mentioned above, due to complex and repetitive linguistic constructs and extremely long sentences, common extraction-oriented automatic text summarization methods should not be expected to show remarkable performance when applied to patent documents. Other, more content-oriented or abstractive summarization techniques are able to perform much better and generate more concise summaries. This paper proposes an efficient summarization system for patents using artificial intelligence, natural language processing and deep learning techniques to condense the knowledge and essential information from a patent document into a single summary that is easier to understand without any redundant formatting and difficult jargon.
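
For orientation only, a minimal sketch of abstractive (generative) summarization with a pretrained sequence-to-sequence model is shown below; it is not the authors' system, the default model choice is an assumption, and the claim text is invented.

```python
# Minimal illustration of abstractive summarization with a pretrained
# sequence-to-sequence model; not the authors' system.
from transformers import pipeline

summarizer = pipeline("summarization")  # loads a default seq2seq summarization model

claim_text = (
    "A method for wireless power transfer comprising a primary coil, "
    "a resonant secondary coil positioned within a charging surface, and "
    "a controller configured to modulate the driving frequency of the primary "
    "coil in response to a detected impedance change of the secondary coil..."
)

summary = summarizer(claim_text, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```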

Keywords: abstractive summarization, deep learning, natural language processing, patent document

Procedia PDF Downloads 110
2829 Hemispheric Locus and Gender Predict the Delay between the Moment of Stroke and Hospitalization

Authors: D. Anderlini, G. Wallis

Abstract:

Background: The number of people experiencing stroke is steadily increasing due to changes in diet and lifestyle, to longer life expectancy resulting in an older population, and to higher survival rates as a consequence of improvements in acute-phase care. This study considers what risk factors might contribute to delayed entry to hospital for treatment. Methods: We analyzed data from 2472 patients admitted to the Stroke Unit of the Royal Brisbane Women's Hospital, Australia, between 2002 and 2011. Results: Previous studies have reported that factors which can contribute to delay include the patient’s age, the time of day, physical location, visiting the GP instead of going to the emergency department, means of transport, severity of symptoms and type of stroke. Contrary to the findings of other studies, we found a strong correlation between side of lesion and delay in admission: patients with right hemisphere lesions had an average delay of 3.78 days, while patients with left hemisphere lesions had an average delay of 1.49 days. Damage to the right hemisphere generally results in motor impairment of the non-dominant hand and no speech impediment. In contrast, left hemisphere lesions can result in deficits of dominant hand function and in aphasia, which will be noticed even if their impact on performance is relatively minor. A finding which goes against many previous studies is the fact that women get to the hospital much sooner than men, with an average delay of 0.92 days in women vs. 3.36 days in men. Conclusion: Acute surgical-pharmacological therapies are most effective if applied immediately after stroke. Hence delays to admission can be crucial to the degree of recovery. The tendency of patients to overlook symptoms of right hemisphere lesions should be the target of information campaigns both for the general public and for GPs. Why do men go to hospital so late? We don't know yet! Nevertheless, an awareness plan specifically directed at the male population should be on the agenda of Health Departments.
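
A short sketch of the kind of group comparison reported above (admission delay by lesion side) is given below; the numbers are synthetic placeholders, not the hospital data.

```python
# Hypothetical sketch of comparing admission delays between lesion sides;
# synthetic delays drawn to roughly mirror the reported averages.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
delay_right = rng.exponential(scale=3.8, size=200)   # days, right-hemisphere lesions
delay_left = rng.exponential(scale=1.5, size=200)    # days, left-hemisphere lesions

stat, p = mannwhitneyu(delay_right, delay_left, alternative="two-sided")
print(f"mean delay right={delay_right.mean():.2f} d, left={delay_left.mean():.2f} d, p={p:.3g}")
```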

Keywords: gender, admission delay, stroke location, bioinformatics, biomedicine

Procedia PDF Downloads 214
2828 Natural Gas Flow Optimization Using Pressure Profiling and Isolation Techniques

Authors: Syed Tahir Shah, Fazal Muhammad, Syed Kashif Shah, Maleeha Gul

Abstract:

In recent years, natural gas has become a relatively clean and high-quality source of energy, which is recovered from deep wells by expensive drilling activities. The recovered substance is purified by processing in multiple stages to remove unwanted contaminants such as dust, dirt, crude oil and other particles. Mostly, gas utilities are concerned with the essential objectives of quantity/quality of natural gas delivery, financial outcome and safe natural gas volumetric inventory in the transmission gas pipeline. Gas quantity and quality are primarily related to standards / advanced metering procedures in processing units/transmission systems, while the financial outcome is defined by purchasing and selling gas as well as the operational cost of the transmission pipeline. SNGPL (Sui Northern Gas Pipelines Limited) Pakistan has a natural gas transmission pipeline network of over 9125 km with a wide range of diameters. This research addresses a few of the issues in accuracy/metering procedures, using multiple advanced instruments to measure gas flow attributes in the transmission system, and examines the effects of good pressure management in the transmission gas pipeline network, with the aim of boosting the gas volume stored in the existing network and, finally, curbing UFG (unaccounted-for gas) losses for financial benefit. Furthermore, based on the results and their observation, it is recommended to enhance the maximum allowable working/operating pressure (MAOP) of the system to 1235 PSIG from the current approximately 900 PSIG, such that the capacity of the network could be entirely utilized. Overall, the results depict that the current model is very efficient and provides excellent results in the minimum possible time.
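
As a back-of-the-envelope illustration of why raising the operating pressure increases the gas volume stored in a pipeline segment (line pack), a simple real-gas sketch is shown below; the pipe dimensions, temperature and compressibility factor are assumptions, not SNGPL data.

```python
# Illustrative line-pack estimate (not SNGPL's model): stored gas scales with
# the ratio of line pressure to standard pressure, corrected for temperature
# and an assumed compressibility factor Z.
import math

def linepack_sm3(length_km, diameter_m, pressure_psig, temp_k=288.0, z_factor=0.90):
    """Approximate stored gas in a pipe segment, expressed in standard cubic metres."""
    p_abs_kpa = (pressure_psig + 14.7) * 6.894757           # gauge psi -> absolute kPa
    p_std_kpa, t_std_k = 101.325, 288.15                     # standard conditions
    volume_m3 = math.pi * (diameter_m / 2) ** 2 * length_km * 1000.0
    # Convert the geometric volume at line conditions to standard conditions (PV = ZnRT)
    return volume_m3 * (p_abs_kpa / p_std_kpa) * (t_std_k / temp_k) / z_factor

low = linepack_sm3(100, 0.9, 900)
high = linepack_sm3(100, 0.9, 1235)
print(f"stored gas at 900 PSIG: {low:,.0f} Sm3, at 1235 PSIG: {high:,.0f} Sm3 (+{high/low-1:.0%})")
```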

Keywords: natural gas, pipeline network, UFG, transmission pack, AGA

Procedia PDF Downloads 76
2827 Copywriting and the Creative Edge

Authors: Dandeswar Bisoyi, Preeti Yadav, Utpal Barua

Abstract:

This study addresses the particular way that verbal information can affect the processing of positive and interesting qualities which help in making the brand attractive to the consumer. It also addresses the development of a communication strategy, a very important part of the marketing plan, in which many factors have to be taken into account. Out of all the product strengths, the strategy has to outline one marked differential which will drive the brand. This is the fundamental base on which the entire big-idea-driven creative strategy will be built.

Keywords: copywriting, advertisement, marketing, branding, recall

Procedia PDF Downloads 562
2826 Prevalence of Dengue in Sickle Cell Disease in Pre-school Children

Authors: Nikhil A. Gavhane, Sachin Shah, Ishant S. Mahajan, Pawan D. Bahekar

Abstract:

Introduction: Millions of people are affected by dengue fever every year, which drives up healthcare expenses in many low-income countries. Organ failure and other serious symptoms may result. Another worldwide public health problem is sickle cell anaemia, which is most prevalent in Africa, the Caribbean, and Europe. Dengue epidemics have reportedly occurred in locations with a high frequency of sickle cell disease, compounding the health problems in these areas. Aims and Objectives: This study examines dengue infection in pre-school children with sickle cell disease. Method: This retrospective cohort study examined paediatric patients. Young people with sickle cell disease (SCD) and dengue infection, and a control group without SCD or dengue, were studied. Data on demographics, SCD consequences, medical treatments, and laboratory findings were gathered to analyse the influence of SCD on dengue severity and clinical outcomes, classified as severe or non-severe by the 2009 WHO classification. Using fever or admission symptoms, the study estimated acute illness duration. Result: Table 1 compares haemoglobin genotype-based dengue episode features in SS, SC, and control patients. Table 2 shows that severe dengue cases are older, have longer admission delays, and have particular symptoms. The multivariate analysis in Table 3 indicates a strong association of the SS genotype with severe dengue, multiorgan failure, and acute pulmonary problems. Table 4 relates severe dengue to higher white blood cell counts, anaemia, elevated liver enzymes, and reduced lactate dehydrogenase. Conclusion: This study is valuable but confined to hospitalised dengue patients with sickle cell disease. Small cohorts limit comparisons. Further study is needed since the findings contradict predictions.
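
For illustration, the multivariate analysis described above can be sketched as a logistic regression of severe dengue on haemoglobin genotype, adjusted for age; the data below are synthetic, not the study cohort.

```python
# Synthetic-data sketch of a multivariate logistic model for severe dengue risk
# by genotype (SS vs SC vs control), adjusted for age.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 300
df = pd.DataFrame({
    "genotype": rng.choice(["SS", "SC", "control"], size=n),
    "age": rng.integers(1, 6, size=n),
})
# Synthetic outcome: higher severe-dengue risk for the SS genotype
logit_p = -2.0 + 1.5 * (df.genotype == "SS") + 0.2 * df.age
df["severe"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("severe ~ C(genotype, Treatment('control')) + age", data=df).fit(disp=0)
print(model.summary2().tables[1][["Coef.", "P>|z|"]])
```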

Keywords: dengue, chills, headache, severe myalgia, vomiting, nausea, prostration

Procedia PDF Downloads 54
2825 Listening to Circles, Playing Lights: A Study of Cross-Modal Perception in Music

Authors: Roni Granot, Erica Polini

Abstract:

Music is often described in terms of non-auditory adjectives, such as a rising melody, a bright sound, or a zigzagged contour. Such cross-modal associations have been studied with simple isolated musical parameters, but only rarely in rich musical contexts. The current study probes cross-sensory associations with polarity-based dimensions by means of pairings of 10 adjectives: blunt-sharp, relaxed-tense, heavy-light, low (in space)-high, low (pitch)-high, big-small, hard-soft, active-passive, bright-dark, sad-happy. 30 participants (randomly assigned to one of two groups) were asked to rate one of 27 short saxophone improvisations on a 1 to 6 scale, where 1 and 6 correspond to the opposite poles of each dimension. The 27 improvisations included three exemplars for each of three dimensions (size, brightness, sharpness), played by three different players. Here we focus on whether improvisations were rated consistently on the verbal scale corresponding to the musical dimension they represented (e.g., music improvised to represent a white circle rated as bright, in contrast with music improvised to represent a dark circle rated as dark). Overall, the average scores by dimension showed an upward trend on the equivalent verbal scale, with low ratings for small, bright and sharp musical improvisations and higher scores for large, dark and blunt improvisations. Friedman tests indicate a statistically significant difference for the brightness (χ2 (2) = 19.704, p = .000) and sharpness dimensions (χ2 (2) = 15.750, p = .000), but not for size (χ2 (2) = 1.444, p = .486). Post hoc analysis with Wilcoxon signed-rank tests within the brightness dimension showed significant differences among all possible pairings: the rankings of 'bright' and 'dark' (Z = -3.310, p = .001), of 'bright' and 'medium' (Z = -2.438, p = .015) and of 'dark' and 'medium' music (Z = -2.714, p = .007). Within the sharpness dimension, only the extreme contrasts differed significantly: 'sharp' and 'blunt' music (Z = -3.147, p = .002) and 'sharp' and 'medium' music rated on the sharpness scale (Z = -3.054, p = .002), but not 'medium' and 'blunt' music (Z = -.982, p = .326). In summary, our study suggests a privileged link between music and the perceptual and semantic domain of brightness. In contrast, size seems to be very difficult to convey in music, whereas sharpness seems to be mapped onto the two extremes (sharp vs. blunt) rather than continuously. This is nicely reflected in the musical literature, in titles and texts which stress the association between music and concepts of light or darkness rather than sharpness or size.
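
A minimal sketch of the nonparametric tests used above (a Friedman test across the three levels of a dimension, followed by Wilcoxon signed-rank tests for post hoc pairs) is shown below; the ratings are fabricated placeholders for 30 participants, not the study data.

```python
# Friedman test plus post hoc Wilcoxon signed-rank tests on placeholder ratings.
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(3)
bright = rng.integers(1, 4, size=30)   # 1-6 scale ratings of "bright" improvisations
medium = rng.integers(2, 5, size=30)
dark = rng.integers(3, 7, size=30)

chi2, p = friedmanchisquare(bright, medium, dark)
print(f"Friedman: chi2(2)={chi2:.3f}, p={p:.3g}")

for name, a, b in [("bright vs dark", bright, dark),
                   ("bright vs medium", bright, medium),
                   ("dark vs medium", dark, medium)]:
    stat, p = wilcoxon(a, b)
    print(f"Wilcoxon {name}: W={stat:.1f}, p={p:.3g}")
```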

Keywords: audiovisual, brightness, cross-modal perception, cross-sensory correspondences, size, visual angularity

Procedia PDF Downloads 201
2824 Hot Deformability of Si-Steel Strips Containing Al

Authors: Mohamed Yousef, Magdy Samuel, Maha El-Meligy, Taher El-Bitar

Abstract:

The present work deals with a 2% Si-steel alloy. The alloy contains 0.05% C as well as 0.85% Al. The alloy under investigation would be used for electrical transformer purposes. A heating (expansion) - cooling (contraction) dilation investigation was executed to detect the α, α+γ, and γ transformation temperatures at the inflection points of the dilation curve. On heating, primary α was detected in the temperature range between room temperature and 687 °C. The α+γ domain was detected in the range between 687 °C and 746 °C. The γ phase exists in the closed γ region between 746 °C and 1043 °C. The α-phase domain appears again in the temperature range between 1043 and 1105 °C, followed by secondary α at temperatures higher than 1105 °C. A physical simulation of thermo-mechanical processing of the as-cast alloy was carried out. The simulation took into consideration the parameters of the hot flat rolling pilot plant. The process was executed on a thermo-mechanical simulator (Gleeble 3500) and was designed to include seven consecutive passes. The 1st pass represents the roughing stage, while the remaining six passes represent the finish rolling stage. The whole process was executed in the temperature range from 1100 °C to 900 °C. The amount of strain starts at 23.5% in the roughing pass and decreases continuously to reach 7.5% at the last finishing pass. The flow curve of the alloy can be abstracted from the stress-strain curves representing the simulated passes. It shows hardening of the alloy from one pass to the next up to pass no. 6, as a result of the decreasing deformation temperature and the increasing cumulative strain. After pass no. 6, the deformation process enhances dynamic recrystallization phenomena, where the Z-parameter (Zener-Hollomon parameter) would be high.
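
The Z-parameter referred to above is the Zener-Hollomon parameter, Z = strain rate × exp(Q/(RT)); a short hedged calculation is given below, using an assumed textbook-style activation energy rather than a value reported for this alloy.

```python
# Illustrative Zener-Hollomon parameter calculation; Q is an assumed value.
import math

R = 8.314          # gas constant, J/(mol K)
Q = 300e3          # assumed apparent activation energy for hot deformation, J/mol

def zener_hollomon(strain_rate, temp_c):
    """Z = strain_rate * exp(Q / (R*T)), with T in kelvin."""
    return strain_rate * math.exp(Q / (R * (temp_c + 273.15)))

for temp_c in (1100, 1000, 900):
    print(f"T={temp_c} C, strain rate=1/s -> Z = {zener_hollomon(1.0, temp_c):.2e}")
```

The example shows how Z grows as the deformation temperature drops toward the final passes.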

Keywords: Si-steel, hot deformability, critical transformation temperature, physical simulation, thermo-mechanical processing, flow curve, dynamic softening

Procedia PDF Downloads 229
2823 Heavy Metal Contents in Vegetable Oils of Kazakhstan Origin and Life Risk Assessment

Authors: A. E. Mukhametov, M. T. Yerbulekova, D. R. Dautkanova, G. A. Tuyakova, G. Aitkhozhayeva

Abstract:

The accumulation of heavy metals in food is a constant problem in many parts of the world. Vegetable oils are widely used, both for cooking and for processing in the food industry, meeting the main dietary requirements. Heavy metals, among the main chemical pollutants, are commonly found in vegetable oils. These chemical pollutants are carcinogenic, teratogenic and immunotoxic, harmful to consume, and have a negative effect on human health even in trace amounts. Residues of these substances can easily accumulate in vegetable oil during cultivation, processing and storage. In this article, the concentrations of heavy metal ions in vegetable oils of Kazakhstan production are studied: sunflower, rapeseed, safflower and linseed oil. The heavy metals arsenic, cadmium, lead and nickel were determined in three repetitions by flame atomic absorption spectrometry. Analysis of the vegetable oil samples revealed that the largest lead (Pb) contamination, 0.065 mg/kg, was determined in linseed oil. The largest amount of cadmium (Cd), 0.009 mg/kg, was found in safflower oil. Arsenic (As) was determined in rapeseed and safflower oils at 0.003 mg/kg, and was not detected in linseed and sunflower oil. The largest nickel (Ni) content, 0.433 mg/kg, was found in linseed oil. The heavy metal contents in the test samples complied with the requirements of regulatory documents for vegetable oils. An assessment of the health risk of the vegetable oils at a daily consumption of 36 g shows that all samples of vegetable oils produced in Kazakhstan are safe for consumption. But further monitoring is needed, since all these metals are toxic and their harmful effects become apparent only after several years of exposure.
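
A simple sketch of the dietary exposure estimate underlying such a risk assessment is shown below: estimated daily intake (EDI) = metal concentration × oil consumption / body weight. The 70 kg body weight is an assumption for illustration, not a value from the paper; the concentrations and the 36 g/day intake are taken from the abstract.

```python
# Estimated daily intake (EDI) sketch for the reported maximum concentrations.
OIL_INTAKE_KG_PER_DAY = 0.036   # 36 g/day, as stated in the abstract
BODY_WEIGHT_KG = 70.0           # assumed adult body weight

concentrations_mg_per_kg = {    # reported maxima per metal
    "Pb": 0.065,                # linseed oil
    "Cd": 0.009,                # safflower oil
    "As": 0.003,                # rapeseed/safflower oil
    "Ni": 0.433,                # linseed oil
}

for metal, conc in concentrations_mg_per_kg.items():
    edi = conc * OIL_INTAKE_KG_PER_DAY / BODY_WEIGHT_KG   # mg per kg body weight per day
    print(f"{metal}: EDI = {edi:.2e} mg/kg bw/day")
```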

Keywords: vegetable oil, sunflower oil, linseed oil, safflower oil, toxic metals, food safety, rapeseed oil

Procedia PDF Downloads 118
2822 Information Extraction for Short-Answer Question for the University of the Cordilleras

Authors: Thelma Palaoag, Melanie Basa, Jezreel Mark Panilo

Abstract:

Checking short-answer questions and essays, whether on paper or in electronic form, is a tiring and tedious task for teachers. Evaluating a student’s output requires knowledge of a wide array of domains, and scoring the work is often a critical task. Several attempts have been made in the past few years to create automated writing assessment software, but they have received negative responses from teachers and students alike due to unreliable scoring, lack of feedback and other shortcomings. The study aims to create an application that will be able to check short-answer questions by incorporating information extraction. Information extraction is a subfield of Natural Language Processing (NLP) in which a chunk of text (technically known as unstructured text) is broken down to gather the necessary bits of data and/or keywords (structured text) to be further analyzed or utilized by query tools. The proposed system shall be able to extract keywords or phrases from an individual’s answers and match them against a corpus of words (as defined by the instructor), which shall be the basis of evaluation of the individual’s answer. The proposed system shall also enable the teacher to provide feedback and re-evaluate the student's output for writing elements that the computer cannot fully evaluate, such as creativity and logic. Teachers can formulate, design, and check short-answer questions efficiently by defining keywords or phrases as parameters and assigning weights for checking answers. With the proposed system, the teacher’s time spent checking and evaluating students' output shall be lessened, thus making the teacher more productive and the task easier.
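
A hypothetical sketch of the weighted keyword/phrase matching described above is given below; the rubric, weights and example answer are illustrative stand-ins for teacher-defined parameters.

```python
# Weighted keyword matching for short-answer scoring (illustrative rubric).
import re

def score_answer(answer, weighted_keywords):
    """Sum the weights of teacher-defined keywords/phrases found in the answer."""
    text = answer.lower()
    score, matched = 0.0, []
    for phrase, weight in weighted_keywords.items():
        if re.search(r"\b" + re.escape(phrase.lower()) + r"\b", text):
            score += weight
            matched.append(phrase)
    return score, matched

rubric = {"photosynthesis": 2.0, "chlorophyll": 1.0, "sunlight": 1.0, "glucose": 1.0}
answer = "Plants use sunlight and chlorophyll to make glucose."
score, matched = score_answer(answer, rubric)
print(f"score={score}/{sum(rubric.values())}, matched={matched}")
```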

Keywords: information extraction, short-answer question, natural language processing, application

Procedia PDF Downloads 413
2821 The Use of Political Savviness in Dealing with Workplace Ostracism: A Social Information Processing Perspective

Authors: Amy Y. Wang, Eko L. Yi

Abstract:

Can vicarious experiences of workplace ostracism affect employees’ willingness to voice? Given the increasingly interdependent nature of the modern workplace, in which employees rely on social interactions to fulfill organizational goals, workplace ostracism – the extent to which an individual perceives that he or she is ignored or excluded by others in the workplace – has garnered significant interest from scholars and practitioners alike. Extending beyond conventional studies that largely focus on the perspectives and outcomes of ostracized targets, we address the indirect effects of workplace ostracism on third-party employees embedded in the same social context. Using a social information processing approach, we propose that the ostracism of coworkers acts as political information that influences third-party employees in their decisions to engage in risky and discretionary behaviors such as employee voice. To make sense of and navigate through experiences of workplace ostracism, we posit that both political understanding and political skill allow third-party employees to minimize the risks and uncertainty of voicing. This conceptual model was tested in a study involving 154 supervisor-subordinate dyads of a publicly listed bio-technology firm located in Mainland China. Each supervisor and their direct subordinates formed a work team; each team had a minimum of two and a maximum of four subordinates. Human resources used the master list to distribute the ID-coded questionnaires to the matching names. All studied constructs were measured using existing scales proven effective in previous literature. Hypotheses were tested using Confirmatory Factor Analysis and Hierarchical Multiple Regression. All three hypotheses were supported, showing that employees were less likely to engage in voice behaviors when their coworkers reported having experienced ostracism in the workplace. Results also showed a significant three-way interaction, with political understanding and political skill jointly moderating the relationship between coworkers’ ostracism and employee voice, indicating that political savviness is a valuable resource in mitigating ostracism’s negative and indirect effects. Our results illustrated that an employee’s coworkers being ostracized indeed adversely impacted his or her own voice behavior. However, not all individuals reacted passively to the social context; rather, we found that politically savvy individuals – possessing both political understanding and political skill – were less impacted in their voice behaviors by ostracism in their work environment. At the same time, we found that having only political understanding or only political skill was significantly less effective in mitigating ostracism’s negative effects, suggesting a necessary duality of political knowledge and political skill in combatting ostracism. Organizational implications, recommendations, and future research ideas are also discussed.
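
For readers unfamiliar with moderated regression, a synthetic-data sketch of testing a three-way interaction of this kind (coworker ostracism × political understanding × political skill predicting voice) is shown below; it is not the study's data or exact analysis.

```python
# Synthetic sketch of a three-way interaction test via moderated regression.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 154
df = pd.DataFrame({
    "ostracism": rng.normal(size=n),
    "pol_understanding": rng.normal(size=n),
    "pol_skill": rng.normal(size=n),
})
# Synthetic voice scores: a negative ostracism effect that weakens when both
# political understanding and political skill are high
df["voice"] = (-0.4 * df.ostracism
               + 0.3 * df.ostracism * df.pol_understanding * df.pol_skill
               + rng.normal(scale=0.5, size=n))

model = smf.ols("voice ~ ostracism * pol_understanding * pol_skill", data=df).fit()
print(model.params.filter(like=":"))   # interaction coefficients, incl. the 3-way term
```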

Keywords: employee voice, organizational politics, social information processing, workplace ostracism

Procedia PDF Downloads 121
2820 Controllable Modification of Glass-Crystal Composites with Ion-Exchange Technique

Authors: Andrey A. Lipovskii, Alexey V. Redkov, Vyacheslav V. Rusan, Dmitry K. Tagantsev, Valentina V. Zhurikhina

Abstract:

The presented research is related to the development of a recently proposed technique for the formation of composite materials, like optical glass-ceramics, with a predetermined structure and properties of the crystalline component. The technique is based on control of the size and concentration of the crystalline grains using the phenomenon of glass-ceramics decrystallization (vitrification) induced by ion exchange. This phenomenon was discovered and explained in the beginning of the 2000s, while a related theoretical description was given only in 2016. In general, the developed theory enables one to model the process and optimize the conditions of ion-exchange processing of glass-ceramics which provide given properties of the crystalline component, in particular, the profile of the average size of the crystalline grains. The optimization is possible if one knows two dimensionless parameters of the theoretical model. One of them (β) is directly related to the solubility of the crystalline component of the glass-ceramics in the glass matrix, and the other (γ) is equal to the ratio of the characteristic times of ion-exchange diffusion and crystalline grain dissolution. The presented study is dedicated to the development of an experimental technique and simulation which allow determining these parameters. It is shown that these parameters can be deduced from data on the space distributions of diffusant concentrations and the average size of crystalline grains in glass-ceramics samples subjected to ion-exchange treatment. Measurements at a minimum of two temperatures and two processing times at each temperature are necessary. The composite material used was a silica-based glass-ceramics with crystalline grains of Li2O·SiO2. Cubic samples of the glass-ceramics (6x6x6 mm3) underwent the ion exchange process in a NaNO3 salt melt at 520 °C (for 16 and 48 h), 540 °C (for 8 and 24 h), 560 °C (for 4 and 12 h), and 580 °C (for 2 and 8 h). The ion exchange processing resulted in glass-ceramics vitrification in the subsurface layers where ion-exchange diffusion took place. Slabs about 1 mm thick were cut from the central part of the samples and their large facets were polished. These slabs were used to find the profiles of diffusant concentrations and the average size of the crystalline grains. The concentration profiles were determined from refractive index profiles measured with a Mach-Zehnder interferometer, and the profiles of the average size of the crystalline grains were determined with micro-Raman spectroscopy. Numerical simulations were based on the developed theoretical model of glass-ceramics decrystallization induced by ion exchange. The simulation of the processes was carried out for different values of the β and γ parameters under all above-mentioned ion exchange conditions. As a result, the temperature dependences of the parameters which provided a reliable coincidence of the simulation and experimental data were found. This ensured adequate modeling of the process of glass-ceramics decrystallization in the 520-580 °C temperature interval. The developed approach provides a powerful tool for fine tuning of the glass-ceramics structure, namely, the concentration and average size of crystalline grains.

Keywords: diffusion, glass-ceramics, ion exchange, vitrification

Procedia PDF Downloads 258
2819 Clinical Advice Services: Using Lean Chassis to Optimize Nurse-Driven Telephonic Triage of After-Hour Calls from Patients

Authors: Eric Lee G. Escobedo-Wu, Nidhi Rohatgi, Fouzel Dhebar

Abstract:

It is challenging for patients to navigate through healthcare systems after-hours. This leads to delays in care, patient/provider dissatisfaction, inappropriate resource utilization, readmissions, and higher costs. It is important to provide patients and providers with effective clinical decision-making tools to allow seamless connectivity and coordinated care. In August 2015, patient-centric Stanford Health Care established Clinical Advice Services (CAS) to provide clinical decision support after-hours. CAS is founded on key Lean principles: Value stream mapping, empathy mapping, waste walk, takt time calculations, standard work, plan-do-check-act cycles, and active daily management. At CAS, Clinical Assistants take the initial call and manage all non-clinical calls (e.g., appointments, directions, general information). If the patient has a clinical symptom, the CAS nurses take the call and utilize standardized clinical algorithms to triage the patient to home, clinic, urgent care, emergency department, or 911. Nurses may also contact the on-call physician based on the clinical algorithm for further direction and consultation. Since August 2015, CAS has managed 228,990 calls from 26 clinical specialties. Reporting is built into the electronic health record for analysis and data collection. 65.3% of the after-hours calls are clinically related. Average clinical algorithm adherence rate has been 92%. An average of 9% of calls was escalated by CAS nurses to the physician on call. An average of 5% of patients was triaged to the Emergency Department by CAS. Key learnings indicate that a seamless connectivity vision, cascading, multidisciplinary ownership of the problem, and synergistic enterprise improvements have contributed to this success while striving for continuous improvement.

Keywords: after hours phone calls, clinical advice services, nurse triage, Stanford Health Care

Procedia PDF Downloads 160
2818 Tribological Properties of Non-Stick Coatings Used in Bread Baking Process

Authors: Maurice Brogly, Edwige Privas, Rajesh K. Gajendran, Sophie Bistac

Abstract:

Anti-stick coatings based on perfluoroalkoxy (PFA) polymers are widely used in the food processing industry, especially for bread making. Their tribological properties, such as a low friction coefficient, low surface energy and high heat resistance, make them an appropriate choice for anti-stick coating applications in moulds for the food processing industry. This study is dedicated to evidencing the transfer of contaminants from the coating due to wear and thermal ageing of the mould. The risk of contamination is induced by damage to the coating by the bread crust during the demoulding stage. The study focuses on the wear resistance and potential transfer of perfluorinated polymer from the anti-stick coating. Friction between the perfluorinated coating and bread crust is modeled by a tribological pin-on-disc test. The cellular nature of the bread crust is modeled by a polymer foam. FTIR analysis of the polymer foam after friction allows the evaluation of the transfer from the perfluorinated coating to the polymer foam. The influence of thermal ageing on the physical, chemical and wear properties of the coating is also investigated. FTIR spectroscopic results show that the increase of PFA transfer onto the foam counterface is associated with a decrease of the friction coefficient. Increasing lubrication by film transfer results in a decrease of the friction coefficient. Moreover, increasing the friction test parameters (load, speed and sliding distance) also increases the film transfer onto the counterface. Thermal ageing increases the hydrophobic character of the PFA coating and thus also decreases the friction coefficient.

Keywords: fluorobased polymer coatings, FTIR spectroscopy, non-stick food moulds, wear and friction

Procedia PDF Downloads 299
2817 The Effect of Common Daily Schedule on the Human Circadian Rhythms during the Polar Day on Svalbard: Field Study

Authors: Kamila Weissova, Jitka Skrabalova, Katerina Skalova, Jana Koprivova, Zdenka Bendova

Abstract:

Any Arctic visitor has to deal with extreme conditions, including constant light during the summer season or constant darkness during winter. The light/dark cycle is the most powerful synchronizing signal for the biological clock, and the absence of a daily dark period during the polar day can significantly alter the functional state of the internal clock. However, the inner clock can be synchronized by other zeitgebers such as physical activity, food intake or social interactions. Here, we investigated the effect of the polar day on the circadian clocks of 10 researchers attending the polar base station in the Svalbard region during July. The data obtained on Svalbard were compared with the data obtained before the researchers left for the expedition (in the Czech Republic). To determine the state of the circadian clock we used wrist actigraphy followed by sleep diaries, together with saliva and buccal mucosa samples, both collected every 4 hours over a 24-h interval, to detect melatonin by radioimmunoassay and clock gene (PER1, BMAL1, NR1D1, DBP) mRNA levels by RT-qPCR. The clock gene expression was analyzed using cosinor analysis. From our results, it is apparent that the constant sunlight delayed melatonin onset and postponed physical activity to a corresponding degree. Nevertheless, the clock gene expression displayed a higher amplitude on Svalbard compared with the amplitude detected in the Czech Republic. These results suggest that the common daily schedule at the Svalbard expedition can strengthen the circadian rhythm in an environment lacking a light/dark cycle. In conclusion, constant sunlight delays melatonin onset, but melatonin still maintains its rhythmic secretion. The effect of constant sunlight on the circadian clock can be minimized by a commonly scheduled daily activity.
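
A minimal cosinor-style sketch of the rhythm analysis mentioned above, fitting a 24-h cosine to clock-gene expression sampled every 4 h, is shown below; the data points are synthetic placeholders, not the expedition measurements.

```python
# Cosinor fit: expression(t) = MESOR + amplitude * cos(2*pi*t/24 + acrophase).
import numpy as np
from scipy.optimize import curve_fit

def cosinor(t, mesor, amplitude, acrophase):
    return mesor + amplitude * np.cos(2 * np.pi * t / 24.0 + acrophase)

t = np.arange(0, 24, 4)                           # sampling every 4 hours
rng = np.random.default_rng(5)
expr = cosinor(t, 1.0, 0.6, -1.2) + rng.normal(scale=0.05, size=t.size)

params, _ = curve_fit(cosinor, t, expr, p0=[1.0, 0.5, 0.0])
mesor, amplitude, acrophase = params
print(f"MESOR={mesor:.2f}, amplitude={amplitude:.2f}, acrophase={acrophase:.2f} rad")
```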

Keywords: actigraph, clock genes, human, melatonin, polar day

Procedia PDF Downloads 155
2816 The Effectiveness of First World Asylum Practices in Deterring Applications, Offering Bureaucratic Deniability, and Violating Human Rights: A Greek Case Study

Authors: Claudia Huerta, Pepijn Doornenbal, Walaa Elsiddig

Abstract:

Rising waves of nationalism around the world have led first-world migration-receiving countries to exploit the ambiguity of international refugee law and establish asylum application processes that deter applications, allow for bureaucratic deniability, and violate human rights. This case study of Greek asylum application practices argues that the 'pre-application' asylum process in Greece violates the spirit of international law by making it incredibly difficult for potential asylum seekers to apply for asylum, in essence violating the human rights of thousands of asylum seekers. This study’s focus is on the Greek mainland’s asylum 'pre-application' process, which in 2016 began to require those wishing to apply for asylum to do so during extremely restricted hours via a basic Skype line. The average wait to simply begin the registration process to apply for asylum is 81 days, during which time applicants are forced to live illegally in Greece. This study’s methodology in analyzing the 'pre-application' process consists of hours of interviews with asylum seekers, NGOs, and the Asylum Service office on the ground in Athens, as well as an analysis of the Greek Asylum Service historical asylum registration statistics. This study presents three main findings: the delays associated with the Skype system in Greece are the result of system design, as proven by a statistical analysis of Greek asylum registrations; NGOs have been co-opted by the state to perform state functions during the process; and the government’s use of technology is both purposefully lazy and discriminatory. In conclusion, the study argues that such asylum practices are part of a pattern of policies in first-world migration-receiving countries which discourage asylum seekers from applying and fall short of the standards in international law.

Keywords: asylum, European Union, governance, Greece, irregular, migration, policy, refugee, Skype

Procedia PDF Downloads 111
2815 Parallelization of Random Accessible Progressive Streaming of Compressed 3D Models over Web

Authors: Aayushi Somani, Siba P. Samal

Abstract:

Three-dimensional (3D) meshes are data structures which store geometric information of an object or scene, generally in the form of vertices and edges. Current laser scanning and other geometric data acquisition technologies produce high-resolution sampling, which leads to high-resolution meshes. While high-resolution meshes give better-quality rendering and hence are used often, the processing as well as the storage of 3D meshes is currently resource-intensive. At the same time, web applications for data processing have become ubiquitous owing to their accessibility. For 3D meshes, the advancement of 3D web technologies, such as WebGL and WebVR, has enabled high-fidelity rendering of huge meshes. However, there exists a gap in the ability to stream huge meshes to native client and browser applications due to high network latency. There is also an inherent delay in loading WebGL pages due to large and complex models. The focus of our work is to identify the challenges faced when such meshes are streamed into and processed on hand-held devices with their limited resources. One of the solutions conventionally used in the graphics community to alleviate resource limitations is mesh compression. Our work deals with a two-step approach for random-accessible progressive compression and its parallel implementation. The first step includes partitioning the original mesh into multiple sub-meshes, and we then invoke data parallelism on these sub-meshes for their compression. Subsequent threaded decompression logic is implemented inside the web browser engine through modification of the WebGL implementation in the Chromium open-source engine. This concept can be used to completely revolutionize the way e-commerce and Virtual Reality technology works for consumer electronic devices. These objects can be compressed on the server and transmitted over the network. The progressive decompression can be performed on the client device and rendered. Multiple views currently used on e-commerce sites for viewing the same product from different angles can be replaced by a single progressive model for a better and smoother user experience. It can also be used in WebVR for common and widely used activities like virtual reality shopping, watching movies and playing games. Our experiments and comparison with existing techniques show encouraging results in terms of latency (the compressed size is ~10-15% of the original mesh), processing time (20-22% increase over the serial implementation) and quality of user experience in the web browser.
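
A hedged sketch of the data-parallel first step described above, partitioning a mesh into sub-meshes and compressing each chunk in parallel, is given below; zlib over quantized vertices is only a stand-in codec for illustration, not the progressive compression scheme used in the paper.

```python
# Partition vertices into sub-meshes and compress each chunk in parallel.
import zlib
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def compress_submesh(vertices):
    quantized = np.round(vertices * 1024).astype(np.int16)   # coarse quantization
    return zlib.compress(quantized.tobytes(), level=9)

def parallel_compress(vertices, n_parts=4):
    parts = np.array_split(vertices, n_parts)                # partition into sub-meshes
    with ProcessPoolExecutor() as pool:
        return list(pool.map(compress_submesh, parts))

if __name__ == "__main__":
    mesh_vertices = np.random.rand(100_000, 3).astype(np.float32)
    chunks = parallel_compress(mesh_vertices)
    ratio = sum(len(c) for c in chunks) / mesh_vertices.nbytes
    print(f"{len(chunks)} chunks, compressed to {ratio:.1%} of original size")
```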

Keywords: 3D compression, 3D mesh, 3D web, chromium, client-server architecture, e-commerce, level of details, parallelization, progressive compression, WebGL, WebVR

Procedia PDF Downloads 156
2814 Memory-Guided Oculomotor Task in High School Football Players with ADHD, Post-Concussive Injuries, and Controls

Authors: B. McGovern, J. F. Luck, A. Gade, I. V. Lake, D. O’Connell, H. C. Cutcliffe, K. P. Shah, E. E. Ginalis, C. M. Lambert, N. Christian, J. R. Kait, A. W. Yu, C. P. Eckersley, C. R. Bass

Abstract:

Mild traumatic brain injury (mTBI) in the form of post-concussive injuries and attention deficit / hyperactivity disorder (ADHD) share similar cognitive impairments, including impaired working memory and executive function. The memory-guided oculomotor task separates working memory and inhibitory components to provide further information on the nature of these deficits in each pathology. Eleven subjects with ADHD, fifteen control subjects, and ten subjects with recent concussive injury were matched on age, gender, and education (all high school-age males). Eye movements were recorded during memory-guided oculomotor tasks with varying delays using EyeLink 1000 (SR Research). The percentage of premature saccades and the latency of correct response are the analyzed measures for response inhibition and working memory, respectively. No significant differences were found in latencies between controls subjects and subjects with ADHD or post-concussive injuries, in accordance with previous studies. Subjects with ADHD and post-concussive injuries both demonstrated a trend of increased percentages of premature saccades compared to control subjects in the same oculomotor task. This trend reached statistical significance between the post-concussive and control groups (p < 0.05). These findings support the primary nature of the executive function deficits in response inhibition in ADHD and mTBI. The interpretation of results is limited by the small sample size and the exploratory nature of the study. Further investigation into oculomotor performance differences in mTBI and ADHD may help in differentiating these pathologies in consequent diagnoses and provide insight into the interaction of these deficits in mTBI.

Keywords: attention deficit / hyperactivity disorder (ADHD), concussion, diagnosis, oculomotor, pediatrics

Procedia PDF Downloads 288
2807 Electrospun Membrane Doped with Gold Nanorods for Surface-Enhanced Raman Spectroscopy

Authors: Ziwei Wang, Andrea Lucotti, Luigi Brambilla, Matteo Tommasini, Chiara Bertarelli

Abstract:

Surface-enhanced Raman spectroscopy (SERS) is a highly sensitive detection technique that provides abundant information on low-concentration analytes across various research areas. Based on localized surface plasmon resonance, metal nanostructures including gold, silver and copper have been investigated as SERS substrates during recent decades. There has been increasingly more attention on exploring high-performance, homogeneous, reproducible SERS substrates. Here, we show that electrospinning, which is an inexpensive technique for fabricating large-scale, self-standing and reproducible membranes, can be effectively used for producing SERS substrates. Nanoparticles and nanorods are added to the electrospinning feed solution to collect functionalized polymer fibrous mats. We report stable electrospun membranes as SERS substrates using gold nanorods (AuNRs) and poly(vinyl alcohol) (PVA). In particular, a post-processing crosslinking step using glutaraldehyde in an acetone environment was applied to the electrospun membrane. It allows the membrane to be used in any liquid environment, including water, which is of interest both for sensing contaminants in wastewater and for biosensing. This crosslinked AuNRs/PVA membrane has demonstrated excellent performance as a SERS substrate for a low-concentration (10⁻⁶ M) Rhodamine 6G (Rh6G) aqueous solution. This post-processing route for fabricating SERS substrates is reported for the first time, and its excellent stability and outstanding performance are proved through Raman imaging. Finally, SERS tests have been applied to several analytes, and the application of the AuNRs/PVA membrane is broadened because the detected analyte can be removed by rinsing. Therefore, this crosslinked AuNRs/PVA membrane is re-usable.

Keywords: SERS spectroscopy, electrospinning, crosslinking, composite materials

Procedia PDF Downloads 129
2812 Teaching Practices for Subverting Significant Retentive Learner Errors in Arithmetic

Authors: Michael Lousis

Abstract:

The systematic identification of the most conspicuous and significant errors made by learners over three years of testing of their progress in learning Arithmetic, throughout the development of the Kassel Project in England and Greece, was accomplished. It has also been shown how retentive these errors were over the three years in the officially provided school instruction of Arithmetic in these countries. The learners’ errors in Arithmetic stemmed from a sample comprised of two hundred (200) English students and one hundred and fifty (150) Greek students. The sample was purposefully selected according to the students’ participation in each testing session in the development of the three-year project, in both domains, Arithmetic and Algebra, simultaneously. Specific teaching practices have been devised and are presented in this study for subverting these learners’ errors, which were found to be retentive at the level of the nationally provided mathematical education of each country. The invention and development of these proposed teaching practices were founded on the rationality of the theoretical accounts concerning the explanation, prediction and control of the errors, on the conceptual metaphor, and on an analysis which tried to identify the required cognitive components and skills of the specific tasks, in terms of Psychology and Cognitive Science as applied to information-processing. The aim of implementing these instructional practices is not only the subversion of these errors but the achievement of mathematical competence, defined as constituted of three elements: appropriate representations - appropriate meaning - appropriately developed schemata. However, praxis is of paramount importance, because there is no 'real truth' independent of science and because praxis serves as quality control when it takes the form of a cognitive method.

Keywords: arithmetic, cognitive science, cognitive psychology, information-processing paradigm, Kassel project, level of the nationally provided mathematical education, praxis, remedial mathematical teaching practices, retentiveness of errors

Procedia PDF Downloads 299
2811 Sentiment Analysis of Fake Health News Using Naive Bayes Classification Models

Authors: Danielle Shackley, Yetunde Folajimi

Abstract:

As more people turn to the internet seeking health-related information, there is more risk of finding false, inaccurate, or dangerous information. Sentiment analysis is a natural language processing technique that assigns polarity scores to text, ranging from positive through neutral to negative. In this research, we evaluate the weight of a sentiment analysis feature added to fake health news classification models. The dataset consists of existing, reliably labeled health article headlines that were supplemented with health information collected about COVID-19 from social media sources. We started with data preprocessing and tested various vectorization methods such as Count and TF-IDF vectorization. We implemented three Naive Bayes classifier models: Bernoulli, Multinomial, and Complement. To test the weight of the sentiment analysis feature on the dataset, we created benchmark Naive Bayes classification models without sentiment analysis, then reproduced those same models with the feature added. We evaluated using precision and accuracy scores. The initial Bernoulli model performed with 90% precision and 75.2% accuracy, while the model supplemented with sentiment labels performed with 90.4% precision and stayed constant at 75.2% accuracy. Our results show that the addition of sentiment analysis did not improve model precision by a wide margin; while there was no evidence of improvement in accuracy, we had a 1.9% improvement margin in the precision score with the Complement model. Future expansion of this work could include replicating the experiment process and substituting a deep learning neural network model for the Naive Bayes.
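
A hedged sketch of this modelling setup, TF-IDF features for headlines with a sentiment polarity score appended as an extra feature and a Complement Naive Bayes classifier, is shown below; the two example headlines, labels and sentiment scores are placeholders, not the study dataset.

```python
# TF-IDF text features + appended sentiment score + Complement Naive Bayes.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import ComplementNB
from scipy.sparse import hstack, csr_matrix

headlines = ["New vaccine shown safe in large trial",
             "Miracle herb cures COVID-19 overnight, doctors stunned"]
labels = [0, 1]                     # 0 = reliable, 1 = fake
sentiment = [[0.3], [0.9]]          # e.g. polarity scores from a sentiment analyzer

vectorizer = TfidfVectorizer()
X_text = vectorizer.fit_transform(headlines)
X = hstack([X_text, csr_matrix(sentiment)])   # append the sentiment feature column

clf = ComplementNB().fit(X, labels)
print(clf.predict(X))
```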

Keywords: sentiment analysis, Naive Bayes model, natural language processing, topic analysis, fake health news classification model

Procedia PDF Downloads 82
2810 The Significance of a Well-Defined Systematic Approach in Risk Management for Construction Projects within Oil Industry

Authors: Batool Ismaeel, Umair Farooq, Saad Mushtaq

Abstract:

Construction projects in the oil industry can be very complex, having unknown outcomes and uncertainties that cannot be easily predicted. Each project has its unique risks generated by a number of factors which, if not controlled, will impact the successful completion of the project, mainly in terms of schedule, cost, quality, and safety. This paper highlights the historic risks associated with projects in the south and east region of Kuwait Oil Company (KOC), collated from the company’s lessons-learned database. From contract award through to handover of the project to the asset owner, the gaps in project execution in terms of managing risk are brought to discussion, showing where a well-defined systematic approach to project risk management should be adopted, given the many claims, changes of scope, budget overruns, and delays in the engineering phase as well as in the procurement and fabrication of long-lead items. This study focuses on a proposed feasible approach to risk management for engineering, procurement and construction (EPC) level projects, including the various stakeholders involved in executing the works, from international to local contractors and vendors in KOC. The proposed approach covers areas categorized into organizational, design, procurement, construction, pre-commissioning, commissioning and project management, in which the risks are identified and require management and mitigation. With the effective deployment and implementation of the proposed risk management system, and its consideration as a vital key to achieving the project’s target, outcomes will be more predictable in the future, and the risk triggers will be managed and controlled. The correct resources can be allocated on a timely basis, allowing the company to avoid any unpredictable outcomes during the execution of the project. This paper recommends applying this risk management approach as an integral part of project management, investigating further in the future the effectiveness of this proposed system for newly awarded projects, and comparing the same with projects of similar budget/complexity that have not applied this approach to risk management.

Keywords: construction, project completion, risk management, uncertainties

Procedia PDF Downloads 138
2809 Development of the Integrated Quality Management System of Cooked Sausage Products

Authors: Liubov Lutsyshyn, Yaroslava Zhukova

Abstract:

Over the past twenty years, there has been a drastic change in the mode of nutrition in many countries, which has been reflected in the development of new products and production techniques and has also led to the expansion of sales markets for food products. Studies have shown that solving food safety problems is almost impossible without the active and systematic activity of organizations directly involved in the production, storage and sale of food products, as well as without the management of end-to-end traceability and exchange of information. The aim of this research is the development of an integrated quality management and safety assurance system based on the principles of HACCP, traceability and a system approach, with the creation of an algorithm for the identification and monitoring of parameters of the technological process of manufacture of cooked sausage products. A methodology for implementing the integrated system based on the principles of HACCP, traceability and a system approach during the manufacturing of cooked sausage products, for the effective provision of the defined properties of the finished product, has been developed. As a result of the research, an evaluation technique and performance criteria for the implementation and operation of the quality management and safety assurance system based on the principles of HACCP have been developed and substantiated. The paper reveals regularities in the influence of applying HACCP principles, traceability and a system approach on the quality and safety parameters of the finished product. The study also determines regularities in the identification of critical control points. The algorithm of functioning of the integrated quality management and safety assurance system is described, and key requirements are defined for the development of software allowing the prediction of properties of the finished product, as well as the timely correction of the technological process and traceability of manufacturing flows. Based on the obtained results, a typical scheme of the integrated quality management and safety assurance system based on HACCP principles, with elements of end-to-end traceability and a system approach, has been developed for the manufacture of cooked sausage products. As a result of the studies, quantitative criteria for evaluating the performance of the quality management and safety assurance system have been developed. A set of guidance documents for the implementation and evaluation of the integrated system based on the HACCP principles in meat processing plants has also been developed. On the basis of the research, the effectiveness of applying continuous monitoring of the manufacturing process during control at the identified critical control points has been demonstrated. The optimal number of critical control points in relation to the manufacture of cooked sausage products has been substantiated. The main results of the research were appraised during 2013-2014 under the conditions of seven enterprises of the meat processing industry and have been implemented at JSC «Kyiv meat processing plant».

Keywords: cooked sausage products, HACCP, quality management, safety assurance

Procedia PDF Downloads 234
2808 Design and Development of Fleet Management System for Multi-Agent Autonomous Surface Vessel

Authors: Zulkifli Zainal Abidin, Ahmad Shahril Mohd Ghani

Abstract:

Agent-based systems technology has been addressed as a new paradigm for conceptualizing, designing, and implementing software systems. Agents are sophisticated systems that act autonomously across open and distributed environments in solving problems. Nevertheless, it is impractical to rely on a single agent to do all the computing processes in solving complex problems. An increasing number of applications lately require multiple agents to work together. A multi-agent system (MAS) is a loosely coupled network of agents that interact to solve problems that are beyond the individual capacities or knowledge of each problem solver. However, the network of a MAS still requires a main system to govern or oversee the operation of the agents in order to achieve a unified goal. We have developed a fleet management system (FMS) in order to manage the fleet of agents, plan routes for the agents, perform real-time data processing and analysis, and issue sets of general and specific instructions to the agents. This FMS should be able to perform real-time data processing, communicate with the autonomous surface vehicle (ASV) agents and generate a bathymetric map according to the data received from each ASV unit. The first algorithm is developed to communicate with the ASVs via radio communication using standard National Marine Electronics Association (NMEA) protocol sentences. The second algorithm takes care of path planning, formation and pattern generation, and is tested using various sample data. Lastly, the bathymetry map generation algorithm makes use of the data collected by the agents to create a bathymetry map in real time. The outcome of this research is expected to be applicable to various other multi-agent systems.
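
A hedged sketch of the first algorithm described above, parsing a standard NMEA sentence received from an ASV over the radio link and verifying its checksum, is shown below; the sample GGA sentence is the commonly cited textbook example, not data from the system.

```python
# NMEA sentence checksum verification and GGA field extraction.
def nmea_checksum_ok(sentence):
    """Validate the XOR checksum of an NMEA sentence like '$GPGGA,...*47'."""
    body, _, checksum = sentence.strip().lstrip("$").partition("*")
    calc = 0
    for ch in body:
        calc ^= ord(ch)
    return f"{calc:02X}" == checksum.upper()

def parse_gga(sentence):
    """Extract time, latitude and longitude fields from a GGA sentence."""
    fields = sentence.split(",")
    return {"utc_time": fields[1],
            "latitude": fields[2] + " " + fields[3],
            "longitude": fields[4] + " " + fields[5]}

sample = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
if nmea_checksum_ok(sample):
    print(parse_gga(sample))
```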

Keywords: autonomous surface vehicle, fleet management system, multi agent system, bathymetry

Procedia PDF Downloads 257
2807 Revolutionizing Healthcare Communication: The Transformative Role of Natural Language Processing and Artificial Intelligence

Authors: Halimat M. Ajose-Adeogun, Zaynab A. Bello

Abstract:

Artificial Intelligence (AI) and Natural Language Processing (NLP) have transformed computer language comprehension, allowing computers to comprehend spoken and written language with human-like cognition. NLP, a multidisciplinary area that combines rule-based linguistics, machine learning, and deep learning, enables computers to analyze and comprehend human language. NLP applications in medicine range from tackling issues in electronic health records (EHR) and psychiatry to improving diagnostic precision in orthopedic surgery and optimizing clinical procedures with novel technologies like chatbots. The technology shows promise in a variety of medical sectors, including quicker access to medical records, faster decision-making for healthcare personnel, diagnosing dysplasia in Barrett's esophagus, boosting radiology report quality, and so on. However, successful adoption requires training for healthcare workers, fostering a deep understanding of NLP components, and highlighting the significance of validation before actual application. Despite prevailing challenges, continuous multidisciplinary research and collaboration are critical for overcoming restrictions and paving the way for the revolutionary integration of NLP into medical practice. This integration has the potential to improve patient care, research outcomes, and administrative efficiency. The research methodology includes using NLP techniques for Sentiment Analysis and Emotion Recognition, such as evaluating text or audio data to determine the sentiment and emotional nuances communicated by users, which is essential for designing a responsive and sympathetic chatbot. Furthermore, the project includes the adoption of a Personalized Intervention strategy, in which chatbots are designed to personalize responses by merging NLP algorithms with specific user profiles, treatment history, and emotional states. The synergy between NLP and personalized medicine principles is critical for tailoring chatbot interactions to each user's demands and conditions, hence increasing the efficacy of mental health care. A detailed survey corroborated this synergy, revealing a remarkable 20% increase in patient satisfaction levels and a 30% reduction in workloads for healthcare practitioners. The poll, which focused on health outcomes and was administered to both patients and healthcare professionals, highlights the improved efficiency and favorable influence on the broader healthcare ecosystem.

Keywords: natural language processing, artificial intelligence, healthcare communication, electronic health records, patient care

Procedia PDF Downloads 58
2806 An Analysis of Learners’ Reports for Measuring Co-Creational Education

Authors: Takatoshi Ishii, Koji Kimita, Keiichi Muramatsu, Yoshiki Shimomura

Abstract:

To increase the quality of learning, teachers and learners need to make a mutual effort to realize educational value. For this purpose, co-creational education between teacher and learners needs to be managed. In this research, we try to identify features of co-creational education. More precisely, we analyze learners' reports with natural language processing and extract features that describe the state of the co-creational education.
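
Given the latent Dirichlet allocation keyword, the feature-extraction step might be sketched as follows with scikit-learn; the toy report corpus, topic count, and vectorizer settings are illustrative assumptions, not the authors' actual configuration.

# Minimal sketch: topic proportions from learners' reports via LDA,
# used as features describing the state of co-creational education.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

reports = [
    "I revised the assignment after the teacher's feedback and discussed it with peers.",
    "The lecture examples were unclear, so I asked questions during the session.",
    "Working in a group helped me connect the theory to the project task.",
]

vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(reports)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(doc_term)          # per-report topic proportions

terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"topic {k}: {top_terms}")
print(doc_topics.round(2))                        # features describing each report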

Keywords: co-creational education, e-portfolios, ICT integration, latent Dirichlet allocation

Procedia PDF Downloads 603
2805 Quantifying the Impact of Intermittent Signal Priority Given to BRT on Ridership and Climate: A Case Study of Ahmadabad

Authors: Smita Chaudhary

Abstract:

Traffic in India is largely uncontrolled and characterized by chaotic conditions in which lane discipline is not followed. Bus Rapid Transit (BRT) has emerged as a viable option to enhance transportation capacity and provide increased levels of mobility and accessibility. At present, many intersections in Ahmadabad face congestion and delay at the signalized intersections along the transit (BRT) lanes. Despite being signalized, most of these intersections are operated manually because of the conflict between BRT buses and heterogeneous traffic. Although the BRTS in Ahmadabad has an exclusive lane of its own, this arrangement brings certain limitations that the city is facing right now. Due to these conflicts, interference, and congestion, both heterogeneous traffic and transit buses suffer delays of a remarkable 3-4 minutes at each intersection, which has become an issue of great concern. There is no provision for BRT bus priority, so the existing signals play little role in managing the traffic, which ultimately calls for manual operation. Daily BRTS ridership has fallen sharply because commuters no longer find this transit mode time-saving; the resulting shift toward private vehicles and the idling of vehicles at intersections cause air and noise pollution. To bring these commuters back, the transit facilities need to be improved. A classified volume count survey and a travel time and delay survey were conducted, and a revised signal design was prepared for the whole study stretch, which comprises three intersections and one roundabout; one intersection was then simulated to assess the effect of giving priority to BRT on side-street queue length and travel time for heterogeneous traffic. This paper recommends revisions to the signal cycle, the introduction of intermittent priority for transit buses, and simulation of an intersection in the study stretch with the proposed signal cycle using VISSIM, in order to make this transit amenity feasible and attractive for commuters in Ahmadabad.
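
Independently of the VISSIM simulation used in the study, the intermittent-priority rule itself (extend the BRT green, or truncate the side-street green, when a bus is detected approaching) can be sketched as simple controller logic; all timing parameters and the detection range below are assumed values for illustration only, not the study's signal design.

# Conceptual sketch of intermittent transit signal priority for a BRT approach.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SignalState:
    phase: str             # "BRT_GREEN" or "SIDE_GREEN"
    time_in_phase: float   # seconds elapsed in the current phase

BRT_GREEN = 35.0           # assumed base green for the BRT approach (s)
SIDE_GREEN = 45.0          # assumed base green for the side street (s)
MAX_EXTENSION = 10.0       # cap on green extension so side-street queues stay bounded
DETECTION_RANGE = 150.0    # assumed bus check-in distance from the stop line (m)

def priority_action(state: SignalState, bus_distance_m: Optional[float]) -> str:
    """Decide the controller action for the current time step."""
    bus_detected = bus_distance_m is not None and bus_distance_m <= DETECTION_RANGE
    if state.phase == "BRT_GREEN":
        base_done = state.time_in_phase >= BRT_GREEN
        if base_done and bus_detected and state.time_in_phase < BRT_GREEN + MAX_EXTENSION:
            return "extend BRT green"            # green extension for the approaching bus
        return "switch to side street" if base_done else "hold"
    # side street currently has the green
    if bus_detected and state.time_in_phase >= 0.5 * SIDE_GREEN:
        return "truncate side-street green"      # early green for the BRT approach
    return "switch to BRT" if state.time_in_phase >= SIDE_GREEN else "hold"

print(priority_action(SignalState("BRT_GREEN", 38.0), bus_distance_m=120.0))   # extend BRT green
print(priority_action(SignalState("SIDE_GREEN", 30.0), bus_distance_m=90.0))   # truncate side-street green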

Keywords: BRT, priority, ridership, signal, VISSIM

Procedia PDF Downloads 430
2804 An EBSD Investigation of Ti-6Al-4Nb Alloy Processed by Plane Strain Compression Test

Authors: Anna Jastrzebska, K. S. Suresh, T. Kitashima, Y. Yamabe-Mitarai, Z. Pakiela

Abstract:

Near-α titanium alloys are important materials for aerospace applications, especially high-temperature applications such as jet engines. The mechanical properties of Ti alloys depend strongly on their processing route, so it is important to understand how the microstructure changes under different processing conditions. In our previous study, Nb was found to improve the oxidation resistance of Ti alloys. In this study, the microstructure evolution of a Ti-6Al-4Nb (wt%) alloy was investigated after plane strain compression tests at hot-working temperatures in the α and β phase regions. High-resolution EBSD was successfully used for precise phase and texture characterization of this alloy. A 1.1 kg Ti-6Al-4Nb ingot was prepared by cold crucible levitation melting and subsequently homogenized at 1050 °C for 1 h, followed by air cooling. Plate-like specimens measuring 10×20×50 mm³ were cut from the ingot by electrical discharge machining (EDM). The plane strain compression tests, using an anvil 10 × 35 mm in size, were performed at three strain rates (0.1 s⁻¹, 1 s⁻¹, and 10 s⁻¹) at 700 °C and 1050 °C to obtain 75% deformation. The microstructure was investigated by scanning electron microscopy (SEM) equipped with an electron backscatter diffraction (EBSD) detector. The α/β phase ratio and phase morphology, as well as the crystallographic texture, subgrain size, misorientation angles, and misorientation gradients corresponding to each phase, were determined in the middle and edge areas of the samples. The deformation mechanism at each working temperature is discussed, and the evolution of texture with strain rate is examined. The microstructure obtained by plane strain compression was heterogeneous, with a wide range of grain sizes, because deformation and dynamic recrystallization occurred during deformation at temperatures in the α and β phase regions; it was strongly influenced by the strain rate.
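
For readers unfamiliar with the misorientation quantities mentioned above, a minimal illustration of how a misorientation angle is obtained from two grain orientations is given below; this is not the authors' EBSD analysis software, and it omits the crystal symmetry operators that a full analysis applies to find the minimum-angle (disorientation) value.

# Illustrative calculation: misorientation angle between two orientations given
# as rotation matrices, theta = arccos((trace(R1^T R2) - 1) / 2).
import numpy as np

def rot_z(deg: float) -> np.ndarray:
    """Rotation matrix for a rotation of 'deg' degrees about the z axis."""
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

def misorientation_angle(R1: np.ndarray, R2: np.ndarray) -> float:
    """Rotation angle (degrees) of the misorientation R1^T R2."""
    delta = R1.T @ R2
    cos_theta = np.clip((np.trace(delta) - 1.0) / 2.0, -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_theta)))

print(misorientation_angle(rot_z(10.0), rot_z(25.0)))   # ~15 degrees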

Keywords: EBSD, plane strain compression test, Ti alloys

Procedia PDF Downloads 369
2803 Quality Analysis of Vegetables Through Image Processing

Authors: Abdul Khalique Baloch, Ali Okatan

Abstract:

The quality analysis of food and vegetables from images is a hot topic nowadays, with researchers improving on previous findings through different techniques and methods. In this research, we review the literature, identify the gaps in it, propose an improved approach, design the algorithm, and develop software to measure quality from images, comparing the results with previous work. The application uses an open-source image dataset and the Python language with the TensorFlow Lite framework. We focus on sorting food and vegetables from images: after processing the images, the application sorts and grades them, producing fewer errors than manual, human-based grading. Digital picture datasets were created and the collected images were arranged by class. The classification accuracy of the system was about 94%. As fruits and vegetables play a major role in day-to-day life, their quality is central to evaluating agricultural produce, and customers always want to buy good-quality fruits and vegetables. Many customers suffer because suppliers provide unhealthy food and vegetables, and no proper quality measurement standard is followed by hotel management. The software developed here measures the quality of fruits and vegetables from images and indicates whether they are fresh or rotten. The approaches reviewed in this work include digital image processing, ResNet, VGG16, CNNs, and transfer learning for grading and feature extraction.
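
A minimal transfer-learning sketch in the spirit of the approach described above (a VGG16 backbone with a small classification head for fresh-versus-rotten grading, exported to TensorFlow Lite) is shown below; the dataset path, image size, class layout, and training settings are illustrative assumptions, not the paper's exact configuration.

# Minimal sketch: VGG16 transfer learning for fruit/vegetable quality grading,
# then conversion to TensorFlow Lite for on-device use.
import tensorflow as tf
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

IMG_SIZE = (224, 224)

train_ds = tf.keras.utils.image_dataset_from_directory(
    "fruit_veg_dataset/train",          # hypothetical path: one subfolder per class
    image_size=IMG_SIZE, batch_size=32)

base = VGG16(weights="imagenet", include_top=False, input_shape=IMG_SIZE + (3,))
base.trainable = False                  # freeze ImageNet features, train only the head

model = models.Sequential([
    layers.Rescaling(1.0 / 255),        # simple rescaling stands in for VGG16's preprocess_input
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dense(len(train_ds.class_names), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)

# Convert to TensorFlow Lite, as the abstract mentions that framework.
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
with open("fruit_quality.tflite", "wb") as f:
    f.write(tflite_model)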

Keywords: deep learning, computer vision, image processing, rotten fruit detection, fruits quality criteria, vegetables quality criteria

Procedia PDF Downloads 53
2802 Evaluating and Improving Healthcare Staff Knowledge of the [NG179] NICE Guidelines on Elective Surgical Care during the COVID-19 Pandemic: A Quality Improvement Project

Authors: Stavroula Stavropoulou-Tatla, Danyal Awal, Mohammad Ayaz Hossain

Abstract:

The first wave of the COVID-19 pandemic saw several countries issue guidance postponing all non-urgent diagnostic evaluations and operations, leading to an estimated backlog of 28 million cases worldwide and over 4 million in the UK alone. In an attempt to regulate the resumption of elective surgical activity, the National Institute for Health and Care Excellence (NICE) introduced the ‘COVID-19 rapid guideline [NG179]’. This project aimed to increase healthcare staff knowledge of the aforementioned guideline to a targeted score of 100% in the disseminated questionnaire within 3 months at the Royal Free Hospital. A standardized online questionnaire was used to assess the knowledge of surgical and medical staff at baseline and following each 4-week-long Plan-Do-Study-Act (PDSA) cycle. During PDSA1, the A4 visual summary accompanying the guideline was visibly placed in all relevant clinical areas and the full guideline was distributed to the staff in charge together with a short briefing on the salient points. PDSA2 involved brief small-group teaching sessions. A total of 218 responses were collected. Mean percentage scores increased significantly from 51±19% at baseline to 81±16% after PDSA1 (t=10.32, p<0.0001) and further to 93±8% after PDSA2 (t=4.9, p<0.0001), with 54% of participants achieving a perfect score. In conclusion, the targeted distribution of guideline printouts and visual aids, combined with small-group teaching sessions, were simple and effective ways of educating healthcare staff about the new standards of elective surgical care at the time of COVID-19. This could facilitate the safe restoration of surgical activity, which is critical in order to mitigate the far-reaching consequences of surgical delays on an unprecedented scale during a time of great crisis and uncertainty.
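
The kind of before-and-after comparison reported above can be reproduced in outline with a two-sample t-test; the score arrays in the sketch below are simulated from the reported means and standard deviations and are not the study's data, and the group sizes are assumed for illustration.

# Illustrative sketch: comparing baseline and post-PDSA questionnaire scores
# with an independent two-sample t-test (Welch's variant).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
baseline = np.clip(rng.normal(51, 19, size=80), 0, 100)     # simulated baseline scores (%)
post_pdsa1 = np.clip(rng.normal(81, 16, size=70), 0, 100)   # simulated post-PDSA1 scores (%)

t_stat, p_value = stats.ttest_ind(post_pdsa1, baseline, equal_var=False)
print(f"mean {baseline.mean():.0f}% -> {post_pdsa1.mean():.0f}%, t = {t_stat:.2f}, p = {p_value:.2g}")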

Keywords: COVID-19, elective surgery, NICE guidelines, quality improvement

Procedia PDF Downloads 175