Search results for: task messages
699 Fully Automated Methods for the Detection and Segmentation of Mitochondria in Microscopy Images
Authors: Blessing Ojeme, Frederick Quinn, Russell Karls, Shannon Quinn
Abstract:
The detection and segmentation of mitochondria from fluorescence microscopy are crucial for understanding the complex structure of the nervous system. However, the constant fission and fusion of mitochondria and image distortion in the background make the task of detection and segmentation challenging. In the literature, a number of open-source software tools and artificial intelligence (AI) methods have been described for analyzing mitochondrial images, achieving remarkable classification and quantitation results. However, the combined expertise in medicine and AI required to utilize these tools poses a challenge to their full adoption and use in clinical settings. Motivated by the advantages of automated methods in terms of good performance, minimal detection time, ease of implementation, and cross-platform compatibility, this study proposes a fully automated framework for the detection and segmentation of mitochondria using both image shape information and descriptive statistics. Using the low-cost, open-source Python and OpenCV libraries, the algorithms are implemented in three stages: pre-processing, image binarization, and coarse-to-fine segmentation. The proposed model is validated using the mitochondrial fluorescence dataset. Ground truth labels generated using the Labkit tool were also used to evaluate the performance of our detection and segmentation model. The study produces good detection and segmentation results and reports the challenges encountered during the image analysis of mitochondrial morphology from the fluorescence mitochondrial dataset. A discussion of the methods and future perspectives of fully automated frameworks concludes the paper.
Keywords: 2D, binarization, CLAHE, detection, fluorescence microscopy, mitochondria, segmentation
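The binarization stage of the three-stage pipeline above can be illustrated with a global Otsu threshold. The sketch below is a hypothetical pure-Python rendering for clarity (in OpenCV, `cv2.threshold` with the `THRESH_OTSU` flag performs the same computation); the paper's actual pipeline also includes CLAHE pre-processing and coarse-to-fine segmentation, which are omitted here.

```python
# Hypothetical sketch of the image binarization stage: a global Otsu
# threshold over an 8-bit grayscale image represented as a flat pixel list.

def otsu_threshold(pixels):
    """Return the gray level that maximizes between-class variance."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * hist[i] for i in range(256))
    best_t, best_var = 0, -1.0
    w_bg, sum_bg = 0, 0.0
    for t in range(256):
        w_bg += hist[t]              # background pixel count so far
        if w_bg == 0:
            continue
        w_fg = total - w_bg          # remaining foreground pixels
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (total_sum - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def binarize(pixels, threshold):
    """Map pixels to a 0/255 binary mask, as for a mitochondria mask."""
    return [255 if p > threshold else 0 for p in pixels]
```

On a synthetic bimodal image (dim background, bright mitochondria), the threshold lands between the two modes and the mask separates them cleanly.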
Procedia PDF Downloads 357
698 [Keynote Talk]: Mathematical and Numerical Modelling of the Cardiovascular System: Macroscale, Mesoscale and Microscale Applications
Authors: Aymen Laadhari
Abstract:
The cardiovascular system is centered on the heart and is characterized by a very complex structure with different physical scales in space (e.g. micrometers for erythrocytes and centimeters for organs) and time (e.g. milliseconds for human brain activity and several years for development of some pathologies). The development and numerical implementation of mathematical models of the cardiovascular system is a tremendously challenging topic at the theoretical and computational levels, inducing consequently a growing interest over the past decade. The accurate computational investigations in both healthy and pathological cases of processes related to the functioning of the human cardiovascular system can be of great potential in tackling several problems of clinical relevance and in improving the diagnosis of specific diseases. In this talk, we focus on the specific task of simulating three particular phenomena related to the cardiovascular system on the macroscopic, mesoscopic and microscopic scales, respectively. Namely, we develop numerical methodologies tailored for the simulation of (i) the haemodynamics (i.e., fluid mechanics of blood) in the aorta and sinus of Valsalva interacting with highly deformable thin leaflets, (ii) the hyperelastic anisotropic behaviour of cardiomyocytes and the influence of calcium concentrations on the contraction of single cells, and (iii) the dynamics of red blood cells in microvasculature. For each problem, we present an appropriate fully Eulerian finite element methodology. We report several numerical examples to address in detail the relevance of the mathematical models in terms of physiological meaning and to illustrate the accuracy and efficiency of the numerical methods.
Keywords: finite element method, cardiovascular system, Eulerian framework, haemodynamics, heart valve, cardiomyocyte, red blood cell
Procedia PDF Downloads 252
697 Application of New Sprouted Wheat Brine for Delicatessen Products From Horse Meat, Beef and Pork
Authors: Gulmira Kenenbay, Urishbay Chomanov, Aruzhan Shoman, Rabiga Kassimbek
Abstract:
The main task of the meat-processing industry is the production of meat products, the main source of animal protein sustaining the human body, in the required volumes, of high quality, and in a diverse assortment. Providing the population with high-quality food products that are biologically complete, balanced in the composition of basic nutrients, and enriched with targeted physiologically active components is one of the highest-priority scientific and technical problems to be solved. In this regard, the formulation of a new brine from sprouted wheat has been developed for meat delicacies from horse meat, beef and pork. The new brine contains flavor-aromatic ingredients, juice of germinated wheat and vegetable juice. The viscosity of horse meat, beef and pork was studied during massaging. Thermodynamic indices, water activity and the binding energy of moisture in horse meat, beef and pork were investigated with the application of the new brine. A recipe for meat products with vegetable additives has been developed, and organoleptic evaluation and physicochemical analyses of the meat products were carried out. Analysis of the obtained data shows that the values of the index aw (water activity) and the binding energy of moisture in the experimental samples of meat products are higher than in the control samples. Investigations established that, with increasing water activity and binding energy of moisture, the tenderness of ready meat delicacies increases with the use of the new brine.
Keywords: compounding, functional products, delicatessen products, brine, vegetable additives
Procedia PDF Downloads 178
696 Evidence Theory Based Emergency Multi-Attribute Group Decision-Making: Application in Facility Location Problem
Authors: Bidzina Matsaberidze
Abstract:
It is known that, in emergency situations, multi-attribute group decision-making (MAGDM) models are characterized by insufficient objective data and a lack of time to respond to the task. Evidence theory is an effective tool for describing such incomplete information in decision-making models when an expert and his or her knowledge are involved in the estimation of the MAGDM parameters. We consider an emergency decision-making model in which expert assessments of humanitarian aid distribution centers (HADCs) are represented as q-rung orthopair fuzzy numbers, and the data structure is described in terms of bodies of evidence. Based on focal probability construction and the experts' evaluations, an objective function, a distribution centers' selection ranking index, is constructed. Our approach to solving the constructed bicriteria partitioning problem consists of two phases. In the first phase, based on the covering matrix, we generate a matrix whose columns allow us to find all possible partitionings of the HADCs among the service centers; some constraints are also taken into consideration while generating the matrix. In the second phase, based on this matrix and using our exact algorithm, we find the partitionings (allocations of the HADCs to the centers) that correspond to the Pareto-optimal solutions. For an illustration of the obtained results, a numerical example is given for the facility location-selection problem.
Keywords: emergency MAGDM, q-rung orthopair fuzzy sets, evidence theory, HADC, facility location problem, multi-objective combinatorial optimization problem, Pareto-optimal solutions
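The second-phase search for Pareto-optimal allocations described above can be sketched as a simple dominance filter over candidate partitionings. The objective values and names below are illustrative placeholders, not the paper's fuzzy ranking index or its exact algorithm.

```python
# Hypothetical sketch of Pareto filtering for a bicriteria minimization
# problem: keep every candidate partitioning that no other candidate
# dominates on both objectives.

def pareto_front(solutions):
    """Return names of non-dominated solutions.

    `solutions` is a list of (name, obj1, obj2) tuples; both objectives
    are minimized, as in a bicriteria partitioning problem.
    """
    front = []
    for name, a1, a2 in solutions:
        dominated = any(
            (b1 <= a1 and b2 <= a2) and (b1 < a1 or b2 < a2)
            for other, b1, b2 in solutions
            if other != name
        )
        if not dominated:
            front.append(name)
    return front
```

With candidates A(1, 5), B(2, 2), C(5, 1) and D(3, 3), D is dominated by B and the front is A, B, C.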
Procedia PDF Downloads 92
695 The Utility and the Consequences of Counter Terrorism Financing
Authors: Fatemah Alzubairi
Abstract:
Terrorism financing is a theme that has evolved dramatically post-9/11. Supra-national bodies, above all the UN Security Council and the Financial Action Task Force (FATF), have established an executive-like mechanism that allows blacklisting individuals and groups, freezing their funds, and restricting their travel, all of which have become part of states' anti-terrorism frameworks. A number of problems arise from building counter-terrorism measures on the foundation of a vague definition of terrorism. This paper examines the utility and consequences of counter-terrorism financing in light of the lack of an international definition of terrorism. The main problem with national and international anti-terrorism legislation is the lack of a clear, objective definition of terrorism. Most, if not all, national laws are broad and vague. Determining what terrorism is remains the crucial underpinning of any successful discussion of counter-terrorism, and of the future success of counter-terrorist measures. This paper focuses on the legal and political consequences of equalizing the treatment of violent terrorist crimes, such as bombing, with non-violent terrorism-related crimes, such as funding terrorist groups. While both sorts of acts require criminalization, treating them equally risks wrongfully or unfairly condemning innocent people who have associated with "terrorists" but are not involved in terrorist activities. This paper examines whether global obligations to counter terrorism financing focus on controlling terrorist groups more than terrorist activities. It also examines the utility of the obligations adopted by the UN Security Council and FATF, and whether they serve global security, or whether the utility is largely restricted to Western security, with little attention paid to the unique needs and demands of other regions.
Keywords: counter-terrorism, definition of terrorism, FATF, security, terrorism financing, UN Security Council
Procedia PDF Downloads 324
694 Exploring Paper Mill Sludge and Sugarcane Bagasse as Carrier Matrix in Solid State Fermentation for Carotenoid Pigment Production by Planococcus sp. TRC1
Authors: Subhasree Majumdar, Sovan Dey, Sayari Mukherjee, Sourav Dutta, Dalia Dasgupta Mandal
Abstract:
Bacterial isolates from the Planococcus genus are known for the production of a yellowish-orange pigment belonging to the carotenoid family. These pigments are of immense pharmacological importance as antioxidant, anticancer, and eye- and liver-protective agents. Producing this pigment in a cost-effective manner is a challenging task. The present study explored paper mill sludge (PMS), a solid lignocellulosic waste generated in large quantities by the pulp and paper industry, as a substrate for carotenoid pigment production by Planococcus sp. TRC1. PMS was compared in terms of efficacy with sugarcane bagasse, a widely explored substrate for generating valuable products via solid state fermentation. The results showed that both biomasses yielded the highest carotenoid content at 48 hours of incubation: 31.6 mg/g for PMS and 42.1 mg/g for bagasse. Compositional analysis of both biomasses showed reductions in lignin, hemicellulose and cellulose content of 41%, 15% and 1% for PMS and 38%, 25% and 6% for sugarcane bagasse after 72 hours of incubation. Structural changes in the biomasses were examined by FT-IR, FESEM and XRD, which further confirmed modification of the solid biomasses by the bacterial isolate. This study revealed the potential of PMS to act as a cheap substrate for carotenoid pigment production by Planococcus sp. TRC1, as it showed significant production in comparison to sugarcane bagasse, which gave only 1.3-fold higher production than PMS. Delignification of PMS by TRC1 during pigment production is another important finding for the reuse of this waste from the paper industry.
Keywords: carotenoid, lignocellulosic, paper mill sludge, Planococcus sp. TRC1, solid state fermentation, sugarcane bagasse
Procedia PDF Downloads 235
693 Omni-Modeler: Dynamic Learning for Pedestrian Redetection
Authors: Michael Karnes, Alper Yilmaz
Abstract:
This paper presents the application of the Omni-Modeler to pedestrian redetection. The pedestrian redetection task poses several challenges for deep neural networks (DNNs): pedestrian appearance varies with camera position and environmental conditions, and considerable specificity is required to recognize one pedestrian from another. DNNs require significant training sets and are not easily adapted to changes in class appearances or changes in the set of classes held in their knowledge domain. Pedestrian redetection requires an algorithm that can actively manage its knowledge domain as individuals move in and out of the scene, as well as learn individual appearances from a few frames of a video. The Omni-Modeler is a dynamically learning few-shot visual recognition algorithm developed for tasks with limited training data availability. The Omni-Modeler adapts the knowledge domain of pre-trained deep neural networks to novel concepts with a calculated localized language encoder. The Omni-Modeler knowledge domain is generated by creating a dynamic dictionary of concept definitions, which is directly updatable as new information becomes available. Query images are identified through nearest-neighbor comparison to the learned object definitions. The study presented in this paper evaluates its performance in re-identifying individuals as they move through a scene in both single-camera and multi-camera tracking applications. The results demonstrate that the Omni-Modeler shows potential for cross-camera pedestrian redetection and is highly effective for single-camera redetection, with 93% accuracy across 30 individuals using 64 example images per individual.
Keywords: dynamic learning, few-shot learning, pedestrian redetection, visual recognition
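The dynamic dictionary and nearest-neighbor query described above can be sketched as follows. This is a hypothetical simplification: the real Omni-Modeler builds its concept definitions with a pre-trained DNN encoder, whereas here the feature vectors are plain lists and the class name is illustrative.

```python
# Hypothetical sketch of an updatable concept dictionary queried by
# cosine-similarity nearest neighbor, mirroring the redetection workflow:
# add identities as they enter the scene, drop them as they leave.
import math

class DynamicDictionary:
    def __init__(self):
        self.concepts = {}           # identity label -> feature vector

    def update(self, label, vector):
        """Add or overwrite a concept definition as new frames arrive."""
        self.concepts[label] = vector

    def remove(self, label):
        """Drop an identity that has left the scene."""
        self.concepts.pop(label, None)

    def query(self, vector):
        """Return the label whose definition is most similar to the query."""
        def cosine(u, v):
            dot = sum(a * b for a, b in zip(u, v))
            norm = (math.sqrt(sum(a * a for a in u))
                    * math.sqrt(sum(b * b for b in v)))
            return dot / norm if norm else 0.0
        return max(self.concepts, key=lambda k: cosine(self.concepts[k], vector))
```

A query vector is assigned to whichever stored identity it most resembles, and removing an identity immediately changes subsequent assignments.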
Procedia PDF Downloads 76
692 Integrating Knowledge Distillation of Multiple Strategies
Authors: Min Jindong, Wang Mingxia
Abstract:
With the widespread use of artificial intelligence in everyday life, computer vision, and especially deep convolutional neural network models, has developed rapidly. As real-world visual target detection tasks grow more complex and recognition accuracy improves, target detection network models have also become very large. Huge deep neural network models are not conducive to deployment on edge devices with limited resources, and the timeliness of network model inference is poor. In this paper, knowledge distillation is used to compress a huge and complex deep neural network model, comprehensively transferring the knowledge contained in the complex network model to a lightweight network model. Different from traditional knowledge distillation methods, we propose a novel knowledge distillation that incorporates multi-faceted features, called M-KD. When training and optimizing the deep neural network model for target detection, the soft target outputs of the teacher network, the relationships between the layers of the teacher network, and the feature attention maps of the teacher network's hidden layers are all transferred to the student network as knowledge. At the same time, we introduce an intermediate transition layer, that is, an intermediate guidance layer, between the teacher network and the student network to bridge the large gap between them. Finally, this paper adds an exploration module to the traditional teacher-student knowledge distillation model, so that the student network not only inherits the knowledge of the teacher network but also explores some new knowledge and characteristics.
Comprehensive experiments in this paper using different distillation parameter configurations across multiple datasets and convolutional neural network models demonstrate that our proposed new network model achieves substantial improvements in speed and accuracy performance.
Keywords: object detection, knowledge distillation, convolutional network, model compression
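The soft-target component of knowledge distillation mentioned above can be sketched with temperature-softened distributions and a KL-divergence penalty. This is a generic sketch, not M-KD itself: the layer-relation, attention-map, guidance-layer and exploration terms are omitted, and the function names are illustrative.

```python
# Minimal sketch of soft-target distillation: teacher and student logits
# are softened with a temperature T, and the student is penalized by the
# KL divergence between the two softened distributions.
import math

def softmax(logits, temperature=1.0):
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def soft_target_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) over temperature-softened distributions."""
    p = softmax(teacher_logits, temperature)   # teacher's "dark knowledge"
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

The loss is zero when the student reproduces the teacher's distribution and grows as the two diverge; a higher temperature exposes more of the teacher's inter-class similarity structure.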
Procedia PDF Downloads 278
691 A Pilot Study on Integration of Simulation in the Nursing Educational Program: Hybrid Simulation
Authors: Vesile Unver, Tulay Basak, Hatice Ayhan, Ilknur Cinar, Emine Iyigun, Nuran Tosun
Abstract:
The aim of this study is to analyze the effects of hybrid simulation, in which standardized patients and task trainers are employed simultaneously. For instance, to teach intravenous (IV) procedures, standardized patients and IV arm models are used together. The study was designed as quasi-experimental research. Before the implementation, ethical approval was obtained from the local ethics commission and administrative permission was granted by the nursing school. The study population consisted of second-grade nursing students (n=77), from whom a total of 39 students were selected through simple random sampling. The views of the participants were collected through a 12-item feedback form developed by the authors and the "Patient intervention self-confidence/competence scale". Participants reported advantages of the hybrid simulation practice, including developing connections between the simulated scenario and real-life situations in clinical conditions, and recognizing the need to learn more about clinical practice. All stated that the implementation was very useful for them. They also reported three major gains: improved critical thinking skills (94.7%), improved decision-making skills (97.3%), and feeling like a nurse (92.1%). The total mean score of the participants on the patient intervention self-confidence/competence scale was 75.23±7.76. The findings obtained in the study suggest that hybrid simulation has positive effects on the integration of theoretical and practical activities before clinical activities for nursing students.
Keywords: hybrid simulation, clinical practice, nursing education, nursing students
Procedia PDF Downloads 293
690 An Efficient Machine Learning Model to Detect Metastatic Cancer in Pathology Scans Using Principal Component Analysis Algorithm, Genetic Algorithm, and Classification Algorithms
Authors: Bliss Singhal
Abstract:
Machine learning (ML) is a branch of artificial intelligence (AI) in which computers analyze data and find patterns in the data. The study focuses on the detection of metastatic cancer using ML. Metastatic cancer is the stage at which cancer has spread to other parts of the body and is the cause of approximately 90% of cancer-related deaths. Normally, pathologists spend hours each day manually classifying tumors as benign or malignant. This tedious task contributes to metastases being mislabeled over 60% of the time and emphasizes the importance of being aware of human error and other inefficiencies. ML is a good candidate for improving the correct identification of metastatic cancer, potentially saving thousands of lives; it can also improve the speed and efficiency of the process, requiring fewer resources and less time. So far, deep learning methodologies have been used in research to detect cancer. This study is a novel approach to determining the potential of combining preprocessing algorithms with classification algorithms for detecting metastatic cancer. The study used two preprocessing algorithms, principal component analysis (PCA) and a genetic algorithm, to reduce the dimensionality of the dataset, and then used three classification algorithms, logistic regression, a decision tree classifier, and k-nearest neighbors, to detect metastatic cancer in the pathology scans. The highest accuracy, 71.14%, was produced by the ML pipeline comprising PCA, the genetic algorithm, and the k-nearest neighbors algorithm, suggesting that preprocessing and classification algorithms have great potential for detecting metastatic cancer.
Keywords: breast cancer, principal component analysis, genetic algorithm, k-nearest neighbors, decision tree classifier, logistic regression
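The final classification stage of the pipeline above can be sketched as a plain k-nearest-neighbors vote. The PCA and genetic-algorithm preprocessing steps are omitted, and the feature vectors and labels below are purely illustrative, not values from the study's pathology data.

```python
# Hypothetical sketch of k-nearest-neighbors classification: the query is
# labeled by majority vote among its k closest training examples in
# Euclidean distance.
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Majority vote among the k training points closest to `query`.

    `train` is a list of (feature_vector, label) pairs.
    """
    def dist(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    neighbors = sorted(train, key=lambda item: dist(item[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]
```

With two well-separated clusters of illustrative "benign" and "malignant" points, a query near either cluster is assigned that cluster's label.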
Procedia PDF Downloads 82
689 Digital Health During a Pandemic: Critical Analysis of the COVID-19 Contact Tracing Apps
Authors: Mohanad Elemary, Imose Itua, Rajeswari B. Matam
Abstract:
Virologists and public health experts have been predicting potential pandemics from coronaviruses for decades. The viruses which caused the SARS and MERS pandemics and the Nipah virus led to many lost lives, but still, the COVID-19 pandemic caused by the SARS-CoV2 virus surprised many scientific communities, experts, and governments with its ease of transmission and its pathogenicity. Governments of various countries reacted by locking down entire populations to their homes to combat the devastation caused by the virus, which led to a loss of livelihood and economic hardship to many individuals and organizations. To revive national economies and support their citizens in resuming their lives, governments focused on the development and use of contact tracing apps as a digital way to track and trace exposure. Google and Apple introduced the Exposure Notification Systems (ENS) framework. Independent organizations and countries also developed different frameworks for contact tracing apps. The efficiency, popularity, and adoption rate of these various apps have been different across countries. In this paper, we present a critical analysis of the different contact tracing apps with respect to their efficiency, adoption rate and general perception, and the governmental strategies and policies, which led to the development of the applications. When it comes to the European countries, each of them followed an individualistic approach to the same problem resulting in different realizations of a similarly functioning application with differing results of use and acceptance. The study conducted an extensive review of existing literature, policies, and reports across multiple disciplines, from which a framework was developed and then validated through interviews with six key stakeholders in the field, including founders and executives in digital health startups and corporates as well as experts from international organizations like The World Health Organization. 
A framework of best practices and tactics is the result of this research. The framework looks at three main questions regarding contact tracing apps: how to develop them, how to deploy them, and how to regulate them. The findings are based on the best practices applied by governments across multiple countries, the mistakes they made, and the best practices applied in similar situations in the business world. The findings include multiple strategies for the development milestone, such as establishing frameworks for cooperation with the private sector and designing the features and user experience of the app to be transparent, effective, and rapidly adaptable. For the deployment section, several tactics are discussed regarding communication messages, marketing campaigns, persuasive psychology, and initial deployment scale strategies. The paper also discusses the data privacy dilemma and how to build a more sustainable system of health-related data processing and utilization. This is done through principles-based regulations specific to health data that allow their use for the public good. This framework offers insights into strategies and tactics that could be implemented as protocols for future public health crises and emergencies, whether global or regional.
Keywords: contact tracing apps, COVID-19, digital health applications, exposure notification system
Procedia PDF Downloads 135
688 The Residual Effects of Special Merchandising Sections on Consumers' Shopping Behavior
Authors: Shih-Ching Wang, Mark Lang
Abstract:
This paper examines the secondary effects and consequences of special displays on subsequent shopping behavior. Special displays are studied as a prominent form of in-store or shopper marketing activity. Two experiments are performed using special value and special quality-oriented displays in an online simulated store environment. The impact of exposure to special displays on mindsets and resulting product choices is tested in a shopping task. Impact on store image is also tested. The experiments find that special displays do trigger shopping mindsets that affect product choices and shopping basket composition and value. Both intended and unintended, positive and negative, effects are found. Special value displays improve store price image but trigger a price-sensitive shopping mindset that causes more lower-priced items to be purchased, lowering total basket dollar value. Special natural food displays improve store quality image and trigger a quality-oriented mindset that causes fewer lower-priced items to be purchased, increasing total basket dollar value. These findings extend the theories of product categorization, mindsets, and price sensitivity found in communication research into the retail store environment. The findings also warn retailers to consider the total effects and consequences of special displays when designing and executing in-store or shopper marketing activity.
Keywords: special displays, mindset, shopping behavior, price consciousness, product categorization, store image
Procedia PDF Downloads 283
687 'I Mean' in Teacher Questioning Sequences in Post-Task Discussions: A Conversation Analytic Study
Authors: Derya Duran, Christine Jacknick
Abstract:
Despite a growing body of research on classroom interaction, especially in language classrooms, much remains to be discovered about how interaction is organized in higher education settings. This study investigates how the discourse marker 'I mean' in teacher questioning turns functions as a resource to promote student participation and to enhance collective understanding in whole-class discussions. The paper takes a conversation analytic perspective, drawing on 30 hours of video recordings of classroom interaction at an English-medium-of-instruction university in Turkey. Two content classrooms (i.e., Guidance) were observed during an academic term. The course was offered to 4th-year students (n=78) in the Faculty of Education; students were majoring in different subjects (i.e., Early Childhood Education, Foreign Language Education, Mathematics Education). Results of the study demonstrate the multi-functionality of the discourse marker 'I mean' in teacher questioning turns. In the context of English-medium-of-instruction classrooms, where possible sources of confusion may occur, we found that 'I mean' is primarily used to indicate upcoming adjustments. More specifically, it is employed for a variety of interactional purposes, such as elaboration, clarification, specification, reformulation, and reference to the instructional activity. The study sheds light on the multiplicity of functions of the discourse marker in academic interactions, and it uncovers how certain linguistic resources serve the organization of repair, such as the maintenance of understanding in classroom interaction. In doing so, it also shows the ways in which participation is routinely enacted in shared interactional events through linguistic resources.
Keywords: conversation analysis, discourse marker, English as a medium of instruction, repair
Procedia PDF Downloads 161
686 Investigating Complement Clause Choice in Written Educated Nigerian English (ENE)
Authors: Juliet Udoudom
Abstract:
Inappropriate complement selection constitutes one of the major features of non-standard complementation in the sentence constructions produced by Nigerian users of English. This paper investigates complement clause choice in Written Educated Nigerian English (ENE) and offers some results. It aims at determining preferred and dispreferred patterns of complement clause selection with respect to verb heads in English by selected Nigerian users of English. The complementation data analyzed in this investigation were obtained from experimental tasks designed to elicit complement categories of verb, noun, adjective and prepositional heads in English. Insights from Government-Binding relations were employed in analyzing the data, which comprised responses obtained from one hundred subjects to a picture elicitation exercise, a grammaticality judgement test, and a free composition task. The findings indicate a general tendency for clausal complements (CPs) introduced by the complementizer that to be preferred by the subjects studied. Of the 235 tokens of clausal complements in our corpus, 128 (54.46%) were CPs headed by that, while whether- and if-clauses recorded 31.07% and 8.94%, respectively. The complement clause type with the lowest incidence of choice was the CP headed by the complementiser for, with a 5.53% incidence of occurrence. Further findings indicate that the semantic features of the relevant embedding verb heads were not taken into consideration in the choice of complementisers introducing the respective complement clauses; hence the that-clause was chosen to complement verbs like prefer. In addition, the dispreferred choice of the for-clause is explicable by the fact that the respondents studied regard 'for' as a preposition, not a complementiser.
Keywords: complement, complement clause, complement selection, complementisers, government-binding
Procedia PDF Downloads 188
685 Labor Productivity and Organization Performance in Specialty Trade Construction: The Moderating Effect of Safety
Authors: Shalini Priyadarshini
Abstract:
The notion of performance measurement has held great appeal for the industry and research communities alike. This is also true for the construction sector, where some propose that performance measurement and productivity analysis are two separate management functions, with productivity a subset of performance, the latter requiring comprehensive analysis of comparable factors. Labor productivity is considered one of the best indicators of production efficiency. The construction industry continues to account for a disproportionate share of injuries and illnesses despite adopting several technological and organizational interventions that promote worker safety. Specialty trades contractors typically complete a large fraction of the work on any construction project, but an insufficient body of work exists that addresses subcontractor safety and productivity issues. A literature review has revealed a possible relation between productivity, safety and other factors, and their links to project, organizational, task and industry performance. This research posits that there is an association between productivity and performance at both the project and organizational levels in the construction industry. Moreover, prior exploration of the importance of safety within the performance-productivity framework has been anecdotal at best. Using a structured questionnaire survey and organization- and project-level data, this study, which combines cross-sectional and longitudinal research designs, addresses the identified research gap and models the relationship between productivity, safety, and performance, with a focus on specialty trades in the construction sector. Statistical analysis is used to establish correlations between the variables of interest. This research identifies the need for developing and maintaining productivity and safety logs for smaller businesses.
Future studies can be designed to establish causal relationships between these variables.
Keywords: construction, safety, productivity, performance, specialty trades
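The correlation step described above, establishing associations between productivity, safety and performance measures, can be sketched with a plain Pearson correlation coefficient. The implementation below is a generic illustration; the study's actual variables and data are not reproduced here.

```python
# Minimal sketch of a Pearson correlation between two observed variables
# (e.g. a productivity measure and a performance measure across projects).
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Values near +1 indicate a strong positive association, values near -1 a strong negative one, and values near 0 no linear association.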
Procedia PDF Downloads 278
684 Extreme Value Theory Applied in Reliability Analysis: Case Study of Diesel Generator Fans
Authors: Jelena Vucicevic
Abstract:
Reliability analysis is a very important task in different areas of work. In any industry, it is crucial for maintenance, efficiency, safety, and monetary costs. There are ways to calculate reliability, unreliability, failure density, and failure rate. In this paper, the reliability of diesel generator fans was calculated through Extreme Value Theory. Extreme Value Theory is not widely used in the engineering field; its usage is well known in other areas such as hydrology, meteorology, and finance. The significance of this theory lies in the fact that, unlike other statistical methods, it focuses on rare and extreme values rather than on averages. It should be noted that the theory is not designed exclusively for extreme events, but for extreme values in any event. Therefore, this is a great opportunity to apply the theory and test whether it can be applied in this situation. The significance of the work is the calculation of time to failure, or reliability, in a new way, using statistics. Another advantage of this approach is that no technical details are needed, and it can be applied to any component whose time to failure must be known in order to plan appropriate maintenance, but also to maximize usage and minimize costs. In this case, calculations were made on diesel generator fans, but the same principle can be applied to any other part. The data for this paper came from a field engineering study of the time to failure of diesel generator fans. The ultimate goal was to decide whether or not to replace the working fans with higher-quality fans to prevent future failures. The results of this method show the approximate time for which the fans will work as they should, and the probability of the fans working longer than a certain estimated time.
Extreme Value Theory can be applied not only to rare and extreme events, but to any event whose values can be considered extreme.Keywords: extreme value theory, lifetime, reliability analysis, statistic, time to failure
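The Gumbel (smallest extreme value) distribution is one common Extreme Value Theory model for time-to-failure data. The sketch below fits it to a set of fan lifetimes by the method of moments and evaluates the resulting reliability function; the hours listed are illustrative values, not the study's field data, and the two-parameter Gumbel-min form is an assumption about the model used.

```python
import math
import statistics

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def fit_gumbel_min(times):
    """Fit a Gumbel (smallest extreme value) distribution by the method
    of moments: beta from the sample standard deviation, mu from the mean.
    For Gumbel-min: mean = mu - gamma*beta, variance = pi^2 * beta^2 / 6."""
    mean = statistics.mean(times)
    sd = statistics.stdev(times)
    beta = sd * math.sqrt(6) / math.pi
    mu = mean + EULER_GAMMA * beta
    return mu, beta

def reliability(t, mu, beta):
    """R(t) = P(T > t) = exp(-exp((t - mu) / beta)) for Gumbel-min."""
    return math.exp(-math.exp((t - mu) / beta))

# Hypothetical fan times to failure in operating hours (illustrative only).
hours = [4500, 4600, 11500, 4150, 5800, 5850, 6100, 8750, 7450, 8100]
mu, beta = fit_gumbel_min(hours)
print(f"mu = {mu:.0f} h, beta = {beta:.0f} h")
print(f"P(fan survives 5000 h) = {reliability(5000, mu, beta):.2f}")
```

The same two functions work for any component's lifetime data; only the sample of observed times changes.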
Procedia PDF Downloads 328683 Automated Fact-Checking by Incorporating Contextual Knowledge and Multi-Faceted Search
Authors: Wenbo Wang, Yi-Fang Brook Wu
Abstract:
The spread of misinformation and disinformation has become a major concern, particularly with the rise of social media as a primary source of information for many people. As a means to address this phenomenon, automated fact-checking has emerged as a safeguard against the spread of misinformation and disinformation. Existing fact-checking approaches aim to determine whether a news claim is true or false, and they have achieved decent veracity prediction accuracy. However, the state-of-the-art methods rely on manually verified external information to assist the checking model in making judgments, which requires significant human resources. This study introduces a framework, SAC, which focuses on 1) augmenting the representation of a claim by incorporating additional context using general-purpose, comprehensive, and authoritative data; 2) developing a search function to automatically select relevant, new, and credible references; 3) focusing on the important parts of the representations of a claim and its reference that are most relevant to the fact-checking task. The experimental results demonstrate that 1) Augmenting the representations of claims and references through the use of a knowledge base, combined with the multi-head attention technique, contributes to improved performance of fact-checking. 2) SAC with auto-selected references outperforms existing fact-checking approaches with manually selected references. Future directions of this study include I) exploring knowledge graphs in Wikidata to dynamically augment the representations of claims and references without introducing too much noise, II) exploring semantic relations in claims and references to further enhance fact-checking.Keywords: fact checking, claim verification, deep learning, natural language processing
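SAC's focus on the parts of a reference most relevant to a claim is built on the multi-head attention technique. As a hedged illustration of the underlying mechanism only (not SAC's actual architecture, head count, or dimensions), the following sketch computes single-head scaled dot-product attention, the building block of multi-head attention, on toy vectors:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Single-head scaled dot-product attention:
    weights = softmax(q . k / sqrt(d)); output = weighted sum of values."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    output = [sum(w * v[i] for w, v in zip(weights, values))
              for i in range(len(values[0]))]
    return weights, output

# Toy vectors: the claim (query) attends over three reference sentences.
claim = [1.0, 0.0, 1.0]
refs  = [[1.0, 0.0, 1.0],   # highly relevant sentence
         [0.0, 1.0, 0.0],   # irrelevant sentence
         [0.5, 0.5, 0.5]]   # partially relevant sentence
weights, _ = attention(claim, refs, refs)
print([round(w, 2) for w in weights])
```

The most similar reference sentence receives the largest weight, which is how attention lets the model emphasize the evidence that matters for the veracity judgment; a multi-head variant runs several such attentions in parallel over learned projections.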
Procedia PDF Downloads 62682 Device-integrated Micro-thermocouples for Reliable Temperature Measurement of GaN HEMTs
Authors: Hassan Irshad Bhatti, Saravanan Yuvaraja, Xiaohang Li
Abstract:
GaN-based devices, such as high electron mobility transistors (HEMTs), offer superior characteristics for high-power, high-frequency, and high-temperature applications [1]. However, this exceptional electrical performance is compromised by undesirable self-heating effects under high-power applications [2, 3]. Some of the issues caused by self-heating are current collapse, thermal runaway, and performance degradation [4, 5]. Therefore, accurate and reliable methods for measuring the temperature of individual devices on a chip are needed to monitor and control the thermal behavior of GaN-based devices [6]. Temperature measurement at the micro/nanoscale is a challenging task that requires specialized techniques such as infrared microscopy, Raman thermometry, and thermoreflectance. Recently, micro-thermocouples (MTCs) have attracted considerable attention due to their advantages of simplicity, low cost, high sensitivity, and compatibility with standard fabrication processes [7, 8]. A micro-thermocouple is a junction of two different metal thin films, which generates a Seebeck voltage related to the temperature difference between a hot and a cold zone. Integrating an MTC in a device allows the local temperature to be measured with high sensitivity and accuracy [9]. This work involves the fabrication and integration of micro-thermocouples (MTCs) to measure the channel temperature of GaN HEMTs. Our fabricated MTC (platinum-chromium junction) has shown a sensitivity of 16.98 µV/K and can measure device channel temperature with high precision and accuracy. The temperature information obtained using this sensor can help improve GaN-based devices and provide thermal engineers with useful insights for optimizing their designs.Keywords: electrical engineering, thermal engineering, power devices, semiconductors
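The Seebeck relation behind an MTC reading can be sketched in a few lines. Assuming a constant sensitivity S over the measured range (a simplification; real Seebeck coefficients vary with temperature), the hot-junction temperature follows from V = S·(T_hot − T_cold). Only the 16.98 µV/K sensitivity below comes from the abstract; the voltage reading and ambient temperature are illustrative values.

```python
def channel_temperature(v_measured_uV, sensitivity_uV_per_K, t_cold_C):
    """Convert a thermocouple voltage to hot-junction temperature.
    The Seebeck voltage is proportional to the hot/cold junction
    temperature difference: V = S * (T_hot - T_cold), so
    T_hot = T_cold + V / S. A constant S is assumed, which is an
    approximation valid only over a limited temperature range."""
    return t_cold_C + v_measured_uV / sensitivity_uV_per_K

# Sensitivity of the Pt-Cr junction reported in the abstract: 16.98 uV/K.
# The voltage reading and the 25 C cold-junction temperature are
# hypothetical values for illustration.
t_hot = channel_temperature(v_measured_uV=1528.2,
                            sensitivity_uV_per_K=16.98,
                            t_cold_C=25.0)
print(f"channel temperature ~ {t_hot:.1f} C")
```

In practice the cold-junction temperature is measured separately (cold-junction compensation), and a calibration curve replaces the constant S when higher accuracy is needed.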
Procedia PDF Downloads 20681 Study of Methods to Reduce Carbon Emissions in Structural Engineering
Authors: Richard Krijnen, Alan Wang
Abstract:
As the world is aiming to reach net zero around 2050, structural engineers must begin finding solutions to contribute to this global initiative. Approximately 40% of global energy-related emissions are due to buildings and construction, and a building’s structure accounts for 50% of its embodied carbon, which indicates that structural engineers are key contributors to finding solutions to reach carbon neutrality. However, this task presents a multifaceted challenge as structural engineers must navigate technical, safety and economic considerations while striving to reduce emissions. This study reviews several options and considerations to reduce carbon emissions that structural engineers can use in their future designs without compromising the structural integrity of their proposed design. Low-carbon structures should adhere to several guiding principles. Firstly, prioritize the selection of materials with low carbon footprints, such as recyclable or alternative materials. Optimization of design and engineering methods is crucial to minimize material usage. Encouraging the use of recyclable and renewable materials reduces dependency on natural resources. Energy efficiency is another key consideration involving the design of structures to minimize energy consumption across various systems. Choosing local materials and minimizing transportation distances help in reducing carbon emissions during transport. Innovation, such as pre-fabrication and modular design or low-carbon concrete, can further cut down carbon emissions during manufacturing and construction. Collaboration among stakeholders and sharing experiences and resources are essential for advancing the development and application of low-carbon structures. 
This paper identifies currently available tools and solutions to reduce embodied carbon in structures, which can be used as part of daily structural engineering practice.Keywords: efficient structural design, embodied carbon, low-carbon material, sustainable structural design
Procedia PDF Downloads 42680 Working with Children and Young People as a much Neglected Area of Education within the Social Studies Curriculum in Poland
Authors: Marta Czechowska-Bieluga
Abstract:
Social work education in Poland focuses mostly on developing competencies that address the needs of individuals and families affected by a variety of life's problems. As a result of the ageing of the Polish population, much attention is also devoted to adults, including the elderly. However, social work with children and young people is an area of education that should be given more consideration. Social work students are mostly trained to cater to the needs of families, and the competencies aimed at responding to the needs of children and young people do not receive enough attention and are only offered in elective classes. This paper reviews the social work programmes offered by selected higher education institutions in Poland in terms of training aimed at helping children and young people address their life problems. The analysis conducted in this study indicates that university education for social work focuses on training professionals who will provide assistance only to adults. Due to changes in the social and political situation, including, in particular, changes in social policy implemented for the needy, it is necessary to extend this area of education to include the specificity of support for children and young people, especially in light of the appearance of new support professions within the area of social work. For example, family assistants, whose task is to support parents in performing their roles as guardians and educators, also assist children. Therefore, it becomes necessary to equip social work professionals with competencies which include issues related to the quality of life of underage people living in families. Social work curricula should be extended to include issues of child and young person development and the patterns governing this phase of life.Keywords: social work education, social work programmes, social worker, university
Procedia PDF Downloads 289679 COVID_ICU_BERT: A Fine-Tuned Language Model for COVID-19 Intensive Care Unit Clinical Notes
Authors: Shahad Nagoor, Lucy Hederman, Kevin Koidl, Annalina Caputo
Abstract:
Doctors’ notes reflect their impressions, attitudes, clinical sense, and opinions about patients’ conditions and progress, and other information that is essential for doctors’ daily clinical decisions. Despite their value, clinical notes are insufficiently researched within the language processing community. Automatically extracting information from unstructured text data is known to be a difficult task, as opposed to dealing with structured information such as vital physiological signs, images, and laboratory results. The aim of this research is to investigate how Natural Language Processing (NLP) and machine learning techniques applied to clinician notes can assist doctors’ decision-making in the Intensive Care Unit (ICU) for coronavirus disease 2019 (COVID-19) patients. The hypothesis is that clinical outcomes like survival or mortality can be useful in influencing the judgement of clinical sentiment in ICU clinical notes. This paper introduces two contributions: first, we introduce COVID_ICU_BERT, a fine-tuned version of clinical transformer models that can reliably predict clinical sentiment for notes of COVID patients in the ICU. We train the model on clinical notes for COVID-19 patients, a type of note not previously seen by clinicalBERT and Bio_Discharge_Summary_BERT. The model, which was based on clinicalBERT, achieves higher predictive accuracy (Acc 93.33%, AUC 0.98, and precision 0.96). Second, we perform data augmentation using clinical contextual word embedding based on a pre-trained clinical model to balance the samples in each class of the data (survived vs. deceased patients). Data augmentation improves the accuracy of prediction slightly (Acc 96.67%, AUC 0.98, and precision 0.92).Keywords: BERT fine-tuning, clinical sentiment, COVID-19, data augmentation
Procedia PDF Downloads 207678 Experimental Simulation Set-Up for Validating Out-Of-The-Loop Mitigation when Monitoring High Levels of Automation in Air Traffic Control
Authors: Oliver Ohneiser, Francesca De Crescenzio, Gianluca Di Flumeri, Jan Kraemer, Bruno Berberian, Sara Bagassi, Nicolina Sciaraffa, Pietro Aricò, Gianluca Borghini, Fabio Babiloni
Abstract:
An increasing degree of automation in air traffic will also change the role of the air traffic controller (ATCO). ATCOs will fulfill significantly more monitoring tasks compared to today. However, this rather passive role may lead to Out-Of-The-Loop (OOTL) effects comprising vigilance decrement and reduced situation awareness. The project MINIMA (Mitigating Negative Impacts of Monitoring high levels of Automation) has conceived a system to control and mitigate such OOTL phenomena. In order to demonstrate the MINIMA concept, an experimental simulation set-up has been designed. This set-up consists of two parts: 1) a Task Environment (TE) comprising a Terminal Maneuvering Area (TMA) simulator, and 2) a Vigilance and Attention Controller (VAC) based on neurophysiological data recording such as electroencephalography (EEG) and eye-tracking devices. The current vigilance level and the attention focus of the controller are measured during the ATCO’s active work in front of the human machine interface (HMI). The derived vigilance level and attention focus trigger adaptive automation functionalities in the TE to avoid OOTL effects. This paper describes the full-scale experimental set-up and the component development work towards it. Hence, it encompasses a pre-test whose results influenced the development of the VAC as well as the functionalities of the final TE and the VAC’s two sub-components.Keywords: automation, human factors, air traffic controller, MINIMA, OOTL (Out-Of-The-Loop), EEG (Electroencephalography), HMI (Human Machine Interface)
Procedia PDF Downloads 383677 Understanding Jordanian Women's Values and Beliefs Related to Prevention and Early Detection of Breast Cancer
Authors: Khlood F. Salman, Richard Zoucha, Hani Nawafleh
Abstract:
Introduction: Jordan has the fourth highest breast cancer prevalence after Lebanon, Bahrain, and Kuwait. Considerable evidence shows that cultural, ethnic, and economic differences influence a woman’s practice of early detection and prevention of breast cancer. Objectives: To understand women’s health beliefs and values in relation to early detection of breast cancer, and to explore the impact of these beliefs on their decisions regarding reluctance or acceptance of early detection measures such as mammogram screening. Design: A qualitative focused ethnography was used to collect data for this study. Settings: The study was conducted in the second largest city surrounded by a large rural area in Ma’an, Jordan. Participants: A total of twenty-seven women with no history of breast cancer, aged 18 and older, who had prior experience with health providers and were willing to share elements of personal health beliefs related to breast health within the larger cultural context. The participants were recruited using the snowball method and word of mouth. Data collection and analysis: A short questionnaire was designed to collect socio-demographic data (SDQ) from all participants. A semi-structured interview guide was used to elicit data through interviews with the informants. NVivo 10, a data management tool, was utilized to assist with data analysis. Leininger’s four phases of qualitative data analysis were used as a guide. The phases used to analyze the data included: 1) collecting and documenting raw data, 2) identifying descriptors and categories according to the domains of inquiry and research questions, with emic and etic data coded for similarities and differences, 3) identifying patterns through contextual analysis, discovering saturation of ideas and recurrent patterns, and 4) identifying themes, theoretical formulations, and recommendations.
Findings: Three major themes emerged within the cultural and religious context: 1. fear, denial, embarrassment, and lack of knowledge were common perceptions of Ma’ani women regarding breast health and screening mammography; 2. health care professionals in Jordan were not quick to offer information and education about breast cancer and screening; and 3. willingness to learn about breast health and cancer prevention. Conclusion: The study indicated that the disparities between infrastructure and resourcing in rural and urban areas of Jordan, knowledge deficits related to breast cancer, and lack of education about breast health may impact women’s decisions to go for mammogram screening. Cultural beliefs, fear, and embarrassment, as well as providers’ lack of focus on breast health, were significant barriers to practicing breast health. Health providers and policy makers should provide resources for the establishment of health education programs on breast cancer early detection and mammography screening. Nurses should play a major role in delivering health education about breast health in general and breast cancer in particular. Culturally appropriate health awareness messages can be used in creating educational programs that can be employed at the national level.Keywords: breast health, beliefs, cultural context, ethnography, mammogram screening
Procedia PDF Downloads 298676 Investigation on Remote Sense Surface Latent Heat Temperature Associated with Pre-Seismic Activities in Indian Region
Authors: Vijay S. Katta, Vinod Kushwah, Rudraksh Tiwari, Mulayam Singh Gaur, Priti Dimri, Ashok Kumar Sharma
Abstract:
Seismic activity develops through abrupt slip on faults and tectonic plate movements driven by stress accumulated in the Earth’s crust. The prediction of seismic activity is a very challenging task. We have studied changes in surface latent heat temperature (SLHT) observed prior to significant earthquakes, which could be considered for short-term earthquake prediction. We analyzed the SLHT variation for inland earthquakes that occurred in Chamba, Himachal Pradesh (32.5 N, 76.1 E, M 4.5, depth 5 km), near the main boundary fault region; the SLHT data were taken from the National Center for Environmental Prediction (NCEP). In this analysis, we calculated daily variations in surface latent heat temperature (°C) over a 1°×1° area, with the pixel covering the earthquake epicenter at the center, for a three-month period before and after the seismic activity. The mean value during that period was considered in order to account for the seasonal effect. The monthly mean was subtracted from each daily value to study the anomalous behavior (∆SLHT) of SLHT during the earthquakes. The results show that the SLHTs adjacent to the epicenters all reach anomalously high values 3-5 days before the seismic activities. The abundant surface water and groundwater in the epicentral region and its surroundings can provide the necessary conditions for the change of SLHT. To further confirm the reliability of the SLHT anomaly, it is necessary to explore its physical mechanism in depth through more earthquake cases.Keywords: surface latent heat temperature, satellite data, earthquake, magnetic storm
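The ∆SLHT computation described above (each daily value minus the period mean) can be sketched in a few lines. The series below is synthetic, and the two-standard-deviation flagging rule is an illustrative choice, not the paper's criterion:

```python
import statistics

def slht_anomaly(daily_slht):
    """Delta-SLHT: subtract the period mean from each daily value so the
    seasonal/background level is removed and excursions stand out."""
    mean = statistics.mean(daily_slht)
    return [round(v - mean, 2) for v in daily_slht]

def flag_anomalies(anomalies, k=2.0):
    """Flag days whose positive anomaly exceeds k sample standard
    deviations (k = 2 is an illustrative threshold)."""
    sd = statistics.stdev(anomalies)
    return [i for i, a in enumerate(anomalies) if a > k * sd]

# Illustrative daily SLHT values (degrees C); the spike near the end
# mimics the anomalous rise reported 3-5 days before an event.
series = [12.1, 11.8, 12.3, 12.0, 11.9, 12.2, 12.1, 16.8, 12.0, 11.7]
deltas = slht_anomaly(series)
print(flag_anomalies(deltas))
```

The same subtraction applied to a three-month window around an earthquake isolates the pre-seismic excursion from the seasonal baseline.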
Procedia PDF Downloads 134675 I Look Powerful So You Will Yield to Me: The Effects of Embodied Power and the Perception of Power on Conflict Management
Authors: Fai-Ho E. Choi, Wing-Tung Au
Abstract:
This study investigated the effects of embodiment on conflict management. As shown in the research literature, physiological factors (i.e., bodily postures) can affect the emotional and cognitive processes of human beings, but little has been shown on whether such effects have ramifications for decision-making involving other individuals. In this study, conflict is defined as a situation in which two parties have seemingly incompatible goals and have to deal with each other in order to maximize their own gain. In a matched-gender experiment, university undergraduate students were randomly assigned to either the high-power condition or the low-power condition, with participants in each condition instructed to perform a fixed set of bodily postures that would embody them with either a high or a low sense of power. One high-power participant would pair up with a low-power participant to engage in an integrative bargaining task and a dictator game. Participants also filled out pre-trial and post-trial questionnaires measuring general sense of power, self-esteem, and self-efficacy. Personality was controlled for. Results are expected to support our hypotheses that people who are embodied with power will be more unyielding in a conflict management situation, and that people who are dealing with another person embodied with power will be more yielding. As conflicts arise frequently both within and between organizations, a better understanding of how human beings function in conflicts is important. This study should provide evidence that bodily postures can influence the perceived sense of power of the parties involved and hence influence conflict outcomes.
Future research needs to be conducted to investigate further how people perceive themselves and how they perceive their opponents in conflicts, such that we can come up with a behavioral theory of conflict management.Keywords: conflict management, embodiment, negotiation, perception
Procedia PDF Downloads 445674 A Real-Time Moving Object Detection and Tracking Scheme and Its Implementation for Video Surveillance System
Authors: Mulugeta K. Tefera, Xiaolong Yang, Jian Liu
Abstract:
Detection and tracking of moving objects are very important in many application contexts, such as the detection and recognition of people, visual surveillance, and automatic generation of video effects. However, the task of detecting the real shape of an object in motion becomes tricky due to various challenges like dynamic scene changes, the presence of shadows, and illumination variations due to light switches. For such systems, once the moving object is detected, tracking is also a crucial step for applications in military defense, video surveillance, human-computer interaction, and medical diagnostics, as well as in commercial fields such as video games. In this paper, an object present in a dynamic background is detected using adaptive Gaussian-mixture-based analysis of the video sequences. The detected moving object is then tracked using region-based moving object tracking and inter-frame differential mechanisms to address the partial overlapping and occlusion problems. Firstly, the detection algorithm effectively detects and extracts the moving object target through enhancement and post-processing morphological operations. Secondly, the extracted object uses region-based moving object tracking and inter-frame difference to improve the tracking speed of real-time moving objects across video frames. Finally, a plotting method was applied to detect the moving objects effectively and describe the motion of the tracked object. The experiments were performed on image sequences acquired in both indoor and outdoor environments, using one stationary camera and one web camera.Keywords: background modeling, Gaussian mixture model, inter-frame difference, object detection and tracking, video surveillance
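The paper's background modeling relies on an adaptive Gaussian mixture model (in a real pipeline this would typically be OpenCV's `createBackgroundSubtractorMOG2`), but the inter-frame differential step it is combined with can be sketched in plain Python on toy grayscale frames. Frame sizes, intensities, and the threshold below are all illustrative:

```python
def frame_difference(prev, curr, threshold=25):
    """Inter-frame difference: pixels whose absolute intensity change
    between consecutive frames exceeds a threshold are marked moving (1)."""
    return [[1 if abs(c - p) > threshold else 0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

def bounding_box(mask):
    """Smallest box (top, left, bottom, right) enclosing the moving pixels."""
    coords = [(r, c) for r, row in enumerate(mask)
              for c, v in enumerate(row) if v]
    if not coords:
        return None
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return min(rows), min(cols), max(rows), max(cols)

# Two toy 4x5 grayscale frames: a bright 2x2 object shifts one pixel right.
frame1 = [[10, 10, 10, 10, 10],
          [10, 200, 200, 10, 10],
          [10, 200, 200, 10, 10],
          [10, 10, 10, 10, 10]]
frame2 = [[10, 10, 10, 10, 10],
          [10, 10, 200, 200, 10],
          [10, 10, 200, 200, 10],
          [10, 10, 10, 10, 10]]
mask = frame_difference(frame1, frame2)
print(bounding_box(mask))
```

In a full system, the binary mask would additionally be cleaned with morphological operations (erosion/dilation) before region-based tracking, as the abstract describes.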
Procedia PDF Downloads 477673 The Study of Formal and Semantic Errors of Lexis by Persian EFL Learners
Authors: Mohammad J. Rezai, Fereshteh Davarpanah
Abstract:
Producing a text in a language which is not one’s mother tongue can be a demanding task for language learners. Examining lexical errors committed by EFL learners is a challenging area of investigation which can shed light on the process of second language acquisition. Despite the considerable number of investigations into grammatical errors, few studies have tackled formal and semantic errors of lexis committed by EFL learners. The current study aimed at examining Persian learners’ formal and semantic errors of lexis in English. To this end, 60 students at three different proficiency levels were asked to write on 10 different topics in 10 separate sessions. Finally, 600 essays written by Persian EFL learners were collected, acting as the corpus of the study. An error taxonomy comprising formal and semantic errors was selected to analyze the corpus. The formal category covered misselection and misformation errors, while the semantic errors were classified into lexical, collocational and lexicogrammatical categories. Each category was further classified into subcategories depending on the identified errors. The results showed that there were 2583 errors in the corpus of 9600 words, among which 2030 formal errors and 553 semantic errors were identified. Formal errors were the most frequent in the corpus (78.6%) and were more prevalent at the advanced level (42.4%). The semantic errors (21.4%) were more frequent at the low intermediate level (40.5%). Among formal errors of lexis, the highest number of errors was attributed to misformation errors (98%), while misselection errors constituted 2% of the errors. Additionally, no significant differences were observed among the three semantic error subcategories, namely collocational, lexical choice and lexicogrammatical.
The results of the study can shed light on the challenges faced by EFL learners in the second language acquisition process.Keywords: collocational errors, lexical errors, Persian EFL learners, semantic errors
Procedia PDF Downloads 142672 A Literature Review about Responsible Third Cycle Supervision
Authors: Johanna Lundqvist
Abstract:
Third cycle supervision is a multifaceted and complex task for supervisors in higher education. It progresses over several years and is affected by several proximal and distal factors. It can result in positive learning outcomes for doctoral students and high-quality publications. However, not all doctoral students thrive during their doctoral studies, nor do they all complete their studies. This is problematic for both the individuals themselves and society at large: doctoral students are valuable and important in current research, future research and higher education. The aim of this literature review is to elucidate what responsible third cycle supervision can include and be in practice. The question posed is as follows: according to recent literature, what is it that characterises responsible third cycle supervision in which doctoral students can thrive and develop their research knowledge and skills? A literature review was conducted, and the data gathered from the literature regarding responsible third cycle supervision was analysed by means of a thematic analysis. The analysis was inspired by the notion of responsible inclusion outlined by David Mitchell. In this study, the term literature refers to research articles and regulations. The preliminary results show that responsible third cycle supervision is associated with a number of interplaying factors (themes). These are as follows: committed supervisors and doctoral students; a clear vision and research problem; an individual study plan; adequate resources; interaction processes and constructive feedback; creativity; cultural awareness; respect and research ethics; systematic quality work and improvement efforts; focus on overall third cycle learning goals; and focus on research presentations and publications. Thus, responsible third cycle supervision can occur if these factors are realized in practice.
This literature review is of relevance to evaluators, researchers, and management in higher education, as well as third cycle supervisors.Keywords: doctoral student, higher education, third cycle supervisors, third cycle programmes
Procedia PDF Downloads 137671 The Influence of the Islamic State (IS) on India: Recent Developments and Challenges
Authors: Alvite Singh Ningthoujam
Abstract:
The most recent terror phenomenon, known as the Islamic State of Iraq and Syria (ISIS), or Islamic State (IS), has made its influence felt in South Asia. This dreaded Sunni militant group has today become a concern in India as well. India has already been affected by various terror activities, and the influence of the IS on radicalised Muslim youths in the country is being watched closely by the security agencies. There have already been a few IS-related incidents in India, due to which this issue has emerged as a threat to India’s internal security. The rapid radicalisation of youths in a few states with sizeable Muslim populations has, to some extent, worked in favour of the IS, particularly in the terror outfit’s recruitment process. What has added to the worry of the Indian security agencies is the announcement by the Al-Qaeda leader, Ayman al-Zawahiri, of the creation of Al-Qaeda in the Indian Subcontinent. In fact, this is a worrisome factor, as both militant groups, al-Qaeda and ISIS, share the objective of targeting India and turning this South Asian country into one of their recruiting grounds. There is also a possibility that an Indian Mujahideen (IM) operative was instrumental in recruiting poor Muslims in a few Indian states for ISIS. If this nexus between ISIS and India’s home-grown terror groups develops into a robust link, combating such an amalgamated force will be a hard task for the Indian security agencies. In the wake of the above developments, this paper analyses the developing trend in India with regard to the IS. It also brings out the reasons why further penetration of IS influence would be a grave concern for the internal security of the country.
The last section of the paper would highlight the steps that have been taken by the Indian government to tackle this menace effectively.Keywords: India, Islamic State, Muslim, Security
Procedia PDF Downloads 376670 Construction and Validation of a Hybrid Lumbar Spine Model for the Fast Evaluation of Intradiscal Pressure and Mobility
Authors: Dicko Ali Hamadi, Tong-Yette Nicolas, Gilles Benjamin, Faure Francois, Palombi Olivier
Abstract:
A novel hybrid model of the lumbar spine, allowing fast static and dynamic simulations of the disc pressure and the spine mobility, is introduced in this work. Our contribution is to combine rigid bodies, deformable finite elements, articular constraints, and springs into a unique model of the spine. Each vertebra is represented by a rigid body controlling a surface mesh to model contacts on the facet joints and the spinous process. The discs are modeled using a heterogeneous tetrahedral finite element model. The facet joints are represented as elastic joints with six degrees of freedom, while the ligaments are modeled using non-linear one-dimensional elastic elements. The challenge we tackle is to make these different models efficiently interact while respecting the principles of Anatomy and Mechanics. The mobility, the intradiscal pressure, the facet joint force and the instantaneous center of rotation of the lumbar spine are validated against the experimental and theoretical results of the literature on flexion, extension, lateral bending as well as axial rotation. Our hybrid model greatly simplifies the modeling task and dramatically accelerates the simulation of pressure within the discs, as well as the evaluation of the range of motion and the instantaneous centers of rotation, without penalizing precision. These results suggest that for some types of biomechanical simulations, simplified models allow far easier modeling and faster simulations compared to usual full-FEM approaches without any loss of accuracy.Keywords: hybrid, modeling, fast simulation, lumbar spine
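The "non-linear one-dimensional elastic elements" used for the ligaments can be illustrated with a tension-only, piecewise force law: a compliant "toe" region at small strains followed by a stiff linear region, a common idealization in spine biomechanics. The stiffness and toe-region parameters below are illustrative choices, not the values of the validated model:

```python
def ligament_force(strain, k=1000.0, toe_strain=0.03):
    """Tension-only, non-linear 1D elastic element: no force in
    compression, a quadratic 'toe' region at small strains, then a
    linear stiff region. The quadratic is scaled so force and slope
    are continuous at strain = toe_strain."""
    if strain <= 0.0:                      # ligaments carry no compression
        return 0.0
    if strain < toe_strain:                # compliant toe region
        return k * strain * strain / (2.0 * toe_strain)
    # linear region, matched to the toe region at toe_strain
    return k * (strain - toe_strain / 2.0)

# Force at 1%, 3%, and 6% strain (k and toe_strain are illustrative).
for eps in (0.01, 0.03, 0.06):
    print(f"strain {eps:.2f} -> force {ligament_force(eps):.2f} N")
```

In a hybrid assembly like the one described, such element forces are applied between the rigid vertebra frames alongside the elastic facet joints and the finite-element discs.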
Procedia PDF Downloads 306