Search results for: fused deep representations
1629 Beauty Representation and Body Politic of Women Writers in Magdalene
Authors: Putri Alya Ramadhani
Abstract:
This research analysed how women writers represent their beauty on a platform called Magdalene. With the vision “Supporting diversity, empowering minds,” Magdalene is a new medium that seeks to represent women's voices rarely heard in mainstream media. This research elaborates further on how women writers, through their writing, use their body politic to subvert patriarchal values. This research used a qualitative method with an explorative design, combining text analysis based on Stuart Hall's theory of representation with in-depth interviews with women writers in Magdalene. The results illustrated that women writers represent their beauty in Magdalene to subvert body and beauty representation in mainstream discourse. Furthermore, the authors identified an identity negotiation as a tension arising from the inevitable oppression and power exerted towards and from women’s bodies. In addition, the women writers showed the power of their bodies through the redefinition of beauty practices and of the self. Hence, they subvert the body dichotomy to redefine body values in society. In conclusion, this study shows various representations of beauty and the body that are underrepresented in the mainstream media through the innovative new medium, Magdalene.
Keywords: women writers, beauty-representation, body politic, new media, identity negotiation
Procedia PDF Downloads 174

1628 Expression Level of Dehydration-Responsive Element Binding/DREB Gene of Some Local Corn Cultivars from Kisar Island-Maluku Indonesia Using Quantitative Real-Time PCR
Authors: Hermalina Sinay, Estri L. Arumingtyas
Abstract:
The research objective was to determine the expression level of the dehydration-responsive element binding (DREB) gene of local corn cultivars from Kisar Island, Maluku. The study design was a randomized block design with a single factor consisting of six local corn cultivars obtained from farmers in Kisar Island and one reference variety, which has been released by the government as a drought-tolerant variety and was obtained from the Cereal Crops Research Institute (ICERI), Maros, South Sulawesi. The leaf sample taken was the second leaf after the flag leaf at 65 days after planting. Isolation of total RNA from leaf samples was carried out according to the protocols of the R&A-Blue™ Total RNA Extraction Kit, and the RNA was used as a template for cDNA synthesis. cDNA synthesis from total RNA was carried out according to the protocol of the One-Step Reverse Transcriptase PCR Premix Kit. Real-time PCR was performed on cDNA from reverse transcription following the procedures of the Real MOD™ Green Real-Time PCR Master Mix Kit. Data obtained from the real-time PCR were analyzed using the relative quantification method based on the critical point/cycle threshold (CP/CT). The analysis of DREB gene expression showed that the highest expression level was obtained in the Deep Yellow local corn cultivar, and the lowest in the Rubby Brown Cob cultivar. It can be concluded that the expression level of the DREB gene in the Deep Yellow local corn cultivar was higher than in the other local corn cultivars and in the Srikandi reference variety.
Keywords: expression, level, DREB gene, local corn cultivars, Kisar Island, Maluku
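The relative quantification mentioned above is conventionally computed with the 2^-ΔΔCt (Livak) formula. The sketch below is a minimal illustration of that calculation only, not the authors' analysis pipeline, and all Ct values in it are hypothetical placeholders.

```python
# Minimal sketch of relative quantification by the 2^-ddCt (Livak) method.
# All Ct values below are illustrative placeholders, not data from the study.

def relative_expression(ct_target_sample, ct_ref_sample,
                        ct_target_calibrator, ct_ref_calibrator):
    """Fold-change of the target gene in a sample relative to a calibrator,
    normalised against a reference (housekeeping) gene."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_calibrator = ct_target_calibrator - ct_ref_calibrator
    dd_ct = d_ct_sample - d_ct_calibrator
    return 2 ** (-dd_ct)

# Hypothetical DREB Ct values: local cultivar vs. the reference variety.
fold = relative_expression(22.1, 18.4, 25.3, 18.6)
print(f"DREB fold-change vs. reference variety: {fold:.2f}")
```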
Procedia PDF Downloads 299

1627 A Constructed Wetland as a Reliable Method for Grey Wastewater Treatment in Rwanda
Authors: Hussein Bizimana, Osman Sönmez
Abstract:
Constructed wetlands are currently among the most widely recognized wastewater treatment options, especially in developing countries, where they have the potential to improve water quality and create valuable wildlife habitat in the ecosystem, with relatively simple requirements for operation and maintenance. The lack of grey wastewater treatment facilities at the Kigali Institute of Science and Technology in Rwanda causes pollution in the surrounding localities of Rugunga sector, where poor sanitation is already a problem. In order to treat the grey water produced at the Kigali Institute of Science and Technology, with its high BOD concentration, high nutrient concentration, and high alkalinity, a horizontal sub-surface flow pilot-scale constructed wetland was designed for operation at the Institute. The study was carried out in a sedimentation tank of 5.5 m x 1.42 m x 1.2 m deep and a horizontal sub-surface constructed wetland of 4.5 m x 2.5 m x 1.42 m deep. The grey wastewater flow rate of 2.5 m3/d flowed through the vegetated wetland and sandy pilot plant. The filter media consisted of 0.6 to 2 mm coarse sand with a hydraulic conductivity of 0.00003472 m/s, and cattails (Typha latifolia spp.) were used as the plant species. The effluent flow rate of the plant is designed to be 1.5 m3/day, and the retention time will be 24 hrs. BOD, COD, and TSS removals of 72% to 79% are estimated to be achieved, while nutrient (nitrogen and phosphate) removal is estimated to be in the range of 34% to 53%. Every effluent characteristic will meet the Rwanda Utility Regulatory Agency guidelines, primarily because the retention time allowed is sufficient to reduce the contaminants within the raw wastewater. A treated-water reuse system was developed whereby the water will be used again in the campus irrigation system.
Keywords: constructed wetlands, hydraulic conductivity, grey wastewater, cattails
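For orientation, a design retention time of this kind follows from the standard hydraulic retention time relation HRT = (porosity × wetted volume) / flow rate. The snippet below is a back-of-the-envelope sketch of that relation only; the water depth and porosity are assumed illustrative values, not figures reported in the study.

```python
# Back-of-the-envelope hydraulic retention time for a sub-surface wetland:
# HRT = (porosity * wetted volume) / flow rate. The water depth and porosity
# below are illustrative assumptions, not values reported in the study.
def hrt_hours(length_m, width_m, water_depth_m, porosity, flow_m3_per_day):
    pore_volume = length_m * width_m * water_depth_m * porosity  # m^3 of voids
    return 24.0 * pore_volume / flow_m3_per_day

# Bed footprint from the abstract (4.5 m x 2.5 m), assumed shallow water
# column (0.38 m) and coarse-sand porosity (0.35), effluent rate 1.5 m3/day.
print(f"HRT = {hrt_hours(4.5, 2.5, 0.38, 0.35, 1.5):.1f} hours")  # roughly 24 h
```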
Procedia PDF Downloads 608

1626 The Conception of the Students about the Presence of Mental Illness at School
Authors: Aline Giardin, Maria Rosa Chitolina, Maria Catarina Zanini
Abstract:
In this paper, we analyze the conceptions of high school students about mental health issues and discuss the creation of basic mental health programs in schools. We base our findings on a quantitative survey we carried out with 156 high school students of the CTISM (Colégio Técnico Industrial de Santa Maria) school, located in the city of Santa Maria, Brazil. We found that: (a) 28 students relate the subject ‘mental health’ with psychiatric hospitals and lunatic asylums; (b) 28 students have relatives affected by mental diseases; (c) 76 students believe that mental patients, if treated, can live a healthy life; (d) depression, schizophrenia and bipolar disorder are the most cited diseases; (e) 84 students have contact with mental patients but know nothing about the disease; (f) 123 students have never been instructed about mental diseases while in school; and (g) 135 students think that a mental health program would be important in the school. We argue that these numbers reflect a vision of mental health that can be related to the reductionist education still present in schools and to the lack of integration between health professionals, science teachers, and students. Furthermore, this vision can also be related to a stigmatization process, which interferes with the interactions and with the representations regarding mental disorders and mental patients in society.
Keywords: mental health, schools, mental illness, conception
Procedia PDF Downloads 469

1625 The Right to State Lands: A Case Study of a Squatter Community in Egypt
Authors: Salwa Salman
Abstract:
In February 2016, Egypt’s President Abdel Fattah Al-Sisi ordered the former Prime Minister, Ibrahim Mehleb, to establish a committee responsible for retrieving looted state lands or providing squatters with land titles according to their individual cases. The specificity of desert lands emerges from their unique position in both Islamic law and Egypt’s Civil Code. In Egypt, desert lands can be transferred to private ownership through peaceful occupation and cultivation. This study explores the (re-)conceptualization of land rights, state territoriality, and sovereignty as part of an emerging narrative on informal land tenure. Through the lens of an informal settlement, the study employs methodological insights from studies in the anthropology of development and their interpretation of Foucauldian discourse analysis to examine official representations of squatting on state lands and put them in conversation with individual narratives on land ownership and dispossession. It also employs Bruno Latour’s actor-network theory to explore the development of social networks through primary land contracts and informal local resource management.
Keywords: state lands, squatter community, Islamic law, Egypt’s Civil Code
Procedia PDF Downloads 171

1624 A Deep Learning Approach to Real Time and Robust Vehicular Traffic Prediction
Authors: Bikis Muhammed, Sehra Sedigh Sarvestani, Ali R. Hurson, Lasanthi Gamage
Abstract:
Vehicular traffic events have overly complex spatial correlations and temporal interdependencies and are also influenced by environmental events such as weather conditions. To capture these spatial and temporal interdependencies and make more realistic vehicular traffic predictions, graph neural network (GNN) based traffic prediction models have been extensively utilized due to their capability of capturing non-Euclidean spatial correlation very effectively. However, most existing GNN-based traffic prediction models have some limitations when learning complex and dynamic spatial and temporal patterns, due to the following missing factors. First, most GNN-based traffic prediction models have used static distance or sometimes haversine distance mechanisms between spatially separated traffic observations to estimate spatial correlation. Secondly, most GNN-based traffic prediction models have not incorporated environmental events that have a major impact on normal traffic states. Finally, most GNN-based models did not use an attention mechanism to focus only on important traffic observations. The objective of this paper is to study and make real-time vehicular traffic predictions while incorporating the effect of weather conditions. To fill the previously mentioned gaps, our prediction model uses the real-time driving distance between sensors to build a distance matrix, or spatial adjacency matrix, and capture spatial correlation. In addition, our prediction model considers the effect of six types of weather conditions and has an attention mechanism in both spatial and temporal data aggregation. Our prediction model efficiently captures the spatial and temporal correlation between traffic events; it relies on a graph attention network (GAT) and bidirectional long short-term memory (Bi-LSTM) plus attention layers and is called GAT-BILSTMA.
Keywords: deep learning, real time prediction, GAT, Bi-LSTM, attention
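To make the spatial-temporal architecture concrete, here is a minimal sketch of a GAT-plus-Bi-LSTM predictor with temporal attention in the spirit of the description above. It is not the authors' GAT-BILSTMA implementation; it assumes PyTorch and PyTorch Geometric, and the dimensions, toy sensor graph, and feature layout (speed plus six weather indicators) are illustrative assumptions.

```python
# Sketch: GAT over the sensor graph at each time step, Bi-LSTM over time,
# attention pooling, linear forecast head. Sizes and the graph are toys.
import torch
import torch.nn as nn
from torch_geometric.nn import GATConv

class GatBiLstm(nn.Module):
    def __init__(self, in_feats, gat_hidden, lstm_hidden, horizon):
        super().__init__()
        self.gat = GATConv(in_feats, gat_hidden, heads=4, concat=False)
        self.lstm = nn.LSTM(gat_hidden, lstm_hidden, batch_first=True,
                            bidirectional=True)
        self.attn = nn.Linear(2 * lstm_hidden, 1)    # temporal attention scores
        self.head = nn.Linear(2 * lstm_hidden, horizon)

    def forward(self, x_seq, edge_index):
        # x_seq: [T, N, F] -- T time steps, N sensors, F features per sensor
        spatial = torch.stack([self.gat(x_t, edge_index) for x_t in x_seq])
        spatial = spatial.permute(1, 0, 2)            # [N, T, gat_hidden]
        h, _ = self.lstm(spatial)                     # [N, T, 2*lstm_hidden]
        w = torch.softmax(self.attn(h), dim=1)        # attention over time
        context = (w * h).sum(dim=1)                  # [N, 2*lstm_hidden]
        return self.head(context)                     # [N, horizon]

# Toy usage: 12 past steps, 5 sensors, speed + six weather indicators.
edge_index = torch.tensor([[0, 1, 1, 2, 3], [1, 0, 2, 1, 4]])
model = GatBiLstm(in_feats=7, gat_hidden=32, lstm_hidden=64, horizon=3)
out = model(torch.randn(12, 5, 7), edge_index)        # -> shape [5, 3]
```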
Procedia PDF Downloads 71

1623 Real-Time Big-Data Warehouse: A Next-Generation Enterprise Data Warehouse and Analysis Framework
Authors: Abbas Raza Ali
Abstract:
Big Data technology is gradually becoming a dire need of large enterprises. These enterprises are generating massive amounts of off-line and streaming data in both structured and unstructured formats on a daily basis. It is a challenging task to effectively extract useful insights from such large-scale datasets, and sometimes it becomes a technology constraint even to manage a transactional data history of more than a few months. This paper presents a framework to efficiently manage massively large and complex datasets. The framework has been tested on a communication service provider producing massively large complex streaming data in binary format. The communication industry is bound by regulators to manage the history of their subscribers’ call records, where every action of a subscriber generates a record. Also, managing and analyzing transactional data allows service providers to better understand their customers’ behaviour; for example, deep packet inspection requires transactional internet usage data to explain the internet usage behaviour of subscribers. However, current relational database systems limit service providers to maintaining history only at a semantic level, aggregated at the subscriber level. The framework addresses these challenges by leveraging Big Data technology, which optimally manages and allows deep analysis of complex datasets. The framework has been applied to offload the service provider's existing Intelligent Network Mediation and relational Data Warehouse onto Big Data. The service provider has a 50+ million subscriber base with yearly growth of 7-10%. The end-to-end process takes no more than 10 minutes, which involves binary-to-ASCII decoding of call detail records, stitching of all the interrogations against a call (transformations), and aggregation of all the call records of a subscriber.
Keywords: big data, communication service providers, enterprise data warehouse, stream computing, Telco IN Mediation
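The decode-stitch-aggregate flow at the end of the abstract can be sketched in a few lines. The toy code below is a schematic illustration under assumed record layouts: the fixed-width binary format, the field names, and the sample records are all hypothetical, not the provider's mediation logic.

```python
# Schematic decode -> stitch -> aggregate pipeline for call detail records.
# The binary layout, field names, and sample data are hypothetical.
import struct
from collections import defaultdict

RECORD = struct.Struct(">I I I")   # call_id, subscriber_id, duration_s (assumed)

def decode(blob: bytes):
    """Binary-to-structured decoding of fixed-width CDR interrogations."""
    for offset in range(0, len(blob), RECORD.size):
        call_id, subscriber, seconds = RECORD.unpack_from(blob, offset)
        yield {"call": call_id, "sub": subscriber, "sec": seconds}

def stitch(records):
    """Merge all interrogations belonging to the same call."""
    calls = defaultdict(lambda: {"sub": None, "sec": 0})
    for r in records:
        calls[r["call"]]["sub"] = r["sub"]
        calls[r["call"]]["sec"] += r["sec"]
    return calls.values()

# Toy blob: three interrogations, two of them for the same call.
blob = RECORD.pack(1, 42, 30) + RECORD.pack(1, 42, 15) + RECORD.pack(2, 7, 60)
usage = defaultdict(int)
for call in stitch(decode(blob)):          # aggregate per subscriber
    usage[call["sub"]] += call["sec"]
print(dict(usage))                         # {42: 45, 7: 60}
```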
Procedia PDF Downloads 175

1622 Deep Learning Based on Image Decomposition for Restoration of Intrinsic Representation
Authors: Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Kensuke Nakamura, Dongeun Choi, Byung-Woo Hong
Abstract:
Artefacts are commonly encountered in the imaging process of clinical computed tomography (CT), where an artefact refers to any systematic discrepancy between the reconstructed observation and the true attenuation coefficient of the object. It is known that CT images are inherently more prone to artefacts due to the image formation process, in which a large number of independent detectors are involved and are assumed to yield consistent measurements. There are a number of different artefact types, including noise, beam hardening, scatter, pseudo-enhancement, motion, helical, ring, and metal artefacts, which cause serious difficulties in reading images. Thus, it is desirable to remove nuisance factors from the degraded image, leaving the fundamental intrinsic information that can provide a better interpretation of the anatomical and pathological characteristics. However, this is considered a difficult task due to the high dimensionality and variability of the data to be recovered, which naturally motivates the use of machine learning techniques. We propose an image restoration algorithm based on the deep neural network framework in which denoising auto-encoders are stacked, building multiple layers. The denoising auto-encoder is a variant of the classical auto-encoder that takes input data and maps it to a hidden representation through a deterministic mapping using a non-linear activation function. The latent representation is then mapped back into a reconstruction, the size of which is the same as the size of the input data. The reconstruction error can be measured by the traditional squared error, assuming the residual follows a normal distribution. In addition to the designed loss function, an effective regularization scheme is applied, using residual-driven dropout determined based on the gradient at each layer. The optimal weights are computed by the classical stochastic gradient descent algorithm combined with the back-propagation algorithm. In our algorithm, we initially decompose an input image into its intrinsic representation and the nuisance factors, including artefacts, based on the classical total variation problem, which can be efficiently optimized by convex optimization algorithms such as the primal-dual method. The intrinsic forms of the input images are provided to the deep denoising auto-encoders along with their original forms in the training phase. In the testing phase, a given image is first decomposed into the intrinsic form and then provided to the trained network to obtain its reconstruction. We apply our algorithm to the restoration of CT images corrupted by artefacts. It is shown that our algorithm improves readability and enhances the anatomical and pathological properties of the object. The quantitative evaluation is performed in terms of PSNR, and the qualitative evaluation shows significant improvement in reading images despite degrading artefacts. The experimental results indicate the potential of our algorithm as a prior solution to image interpretation tasks in a variety of medical imaging applications. This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by the IITP (Institute for Information and Communications Technology Promotion).
Keywords: auto-encoder neural network, CT image artefact, deep learning, intrinsic image representation, noise reduction, total variation
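The building block described here, a denoising auto-encoder that corrupts its input and learns to map it back to the clean version under a squared-error loss, can be sketched briefly. The PyTorch code below is a minimal single-layer illustration with assumed layer sizes and Gaussian corruption; it is not the paper's stacked network and omits the residual-driven dropout scheme.

```python
# Minimal denoising auto-encoder layer of the kind stacked in the paper.
# Layer sizes, noise level, and the random input batch are illustrative.
import torch
import torch.nn as nn

class DenoisingAE(nn.Module):
    def __init__(self, in_dim, hidden_dim, noise_std=0.1):
        super().__init__()
        self.noise_std = noise_std
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        self.decoder = nn.Linear(hidden_dim, in_dim)  # reconstruct input size

    def forward(self, x):
        noisy = x + self.noise_std * torch.randn_like(x)  # corrupt the input
        return self.decoder(self.encoder(noisy))

x = torch.rand(16, 64 * 64)                 # flattened intrinsic-image patches
model = DenoisingAE(64 * 64, 512)
loss = nn.functional.mse_loss(model(x), x)  # squared reconstruction error
loss.backward()
print(f"reconstruction loss: {loss.item():.4f}")
```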
Procedia PDF Downloads 190

1621 Arabic Light Word Analyser: Roles with Deep Learning Approach
Authors: Mohammed Abu Shquier
Abstract:
This paper introduces a word segmentation method using a novel BP-LSTM-CRF architecture for processing semantic output training. The objective of web morphological analysis tools is to link a formal morpho-syntactic description to a lemma, along with morpho-syntactic information, a vocalized form, a vocalized analysis with morpho-syntactic information, and a list of paradigms. A key objective is to continuously enhance the proposed system through an inductive learning approach that considers semantic influences. The system is currently under construction and development based on data-driven learning. To evaluate the tool, an experiment on homograph analysis was conducted. The tool also encompasses the assumption of deep binary segmentation hypotheses, the arbitrary choice of trigram or n-gram continuation probabilities, language limitations, and morphology for both Modern Standard Arabic (MSA) and Dialectal Arabic (DA), which together provide justification for updating this system. Most Arabic word analysis systems are based on the phonotactic morpho-syntactic analysis of a word transmitted using lexical rules, which are mainly used in MENA language technology tools, without taking into account contextual or semantic morphological implications. Therefore, it is necessary to have an automatic analysis tool that takes into account the word sense and not only the morpho-syntactic category. Moreover, such systems are also based on statistical/stochastic models. These stochastic models, such as HMMs, have shown their effectiveness in different NLP applications: part-of-speech tagging, machine translation, speech recognition, etc. As an extension, we focus on language modeling using Recurrent Neural Networks (RNNs); given that morphological analysis coverage is very low in Dialectal Arabic, it is important to investigate how dialect data influence the accuracy of these approaches by developing dialectal morphological processing tools, showing that accounting for dialectal variability can help improve analysis.
Keywords: NLP, DL, ML, analyser, MSA, RNN, CNN
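As a concrete reference point for the LSTM-CRF idea, the sketch below tags each character of a word with a B/I boundary label using a Bi-LSTM emission layer and a CRF decoder. It assumes the third-party pytorch-crf package; the vocabulary size, dimensions, and random toy inputs are illustrative, and this is a simplified variant rather than the authors' BP-LSTM-CRF architecture.

```python
# Minimal BiLSTM-CRF character tagger for word segmentation (B/I tags).
# Assumes the third-party `pytorch-crf` package; all sizes/inputs are toys.
import torch
import torch.nn as nn
from torchcrf import CRF

class BiLstmCrf(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden=128, num_tags=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True,
                            bidirectional=True)
        self.emit = nn.Linear(2 * hidden, num_tags)   # per-character tag scores
        self.crf = CRF(num_tags, batch_first=True)

    def loss(self, chars, tags):
        h, _ = self.lstm(self.embed(chars))
        return -self.crf(self.emit(h), tags)          # negative log-likelihood

    def decode(self, chars):
        h, _ = self.lstm(self.embed(chars))
        return self.crf.decode(self.emit(h))          # best tag sequence

model = BiLstmCrf(vocab_size=100)
chars = torch.randint(0, 100, (1, 10))                # one 10-character "word"
tags = torch.randint(0, 2, (1, 10))                   # toy gold B/I labels
model.loss(chars, tags).backward()
print(model.decode(chars))
```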
Procedia PDF Downloads 42

1620 Integrating Natural Language Processing (NLP) and Machine Learning in Lung Cancer Diagnosis
Authors: Mehrnaz Mostafavi
Abstract:
The assessment and categorization of incidental lung nodules present a considerable challenge in healthcare, often necessitating resource-intensive multiple computed tomography (CT) scans for growth confirmation. This research addresses this issue by introducing a distinct computational approach leveraging radiomics and deep-learning methods. However, understanding local services is essential before implementing these advancements. With diverse tracking methods in place, there is a need for efficient and accurate identification approaches, especially in the context of managing lung nodules alongside pre-existing cancer scenarios. This study explores the integration of text-based algorithms in medical data curation, indicating their efficacy in conjunction with machine learning and deep-learning models for identifying lung nodules. Combining medical images with text data has demonstrated superior data retrieval compared to using each modality independently. While deep learning and text analysis show potential in detecting previously missed nodules, challenges persist, such as increased false positives. The presented research introduces a Structured-Query-Language (SQL) algorithm designed for identifying pulmonary nodules in a tertiary cancer center, externally validated at another hospital. Leveraging natural language processing (NLP) and machine learning, the algorithm categorizes lung nodule reports based on sentence features, aiming to facilitate research and assess clinical pathways. The hypothesis posits that the algorithm can accurately identify lung nodule CT scans and predict concerning nodule features using machine-learning classifiers. Through a retrospective observational study spanning a decade, CT scan reports were collected, and an algorithm was developed to extract and classify data. Results underscore the complexity of lung nodule cohorts in cancer centers, emphasizing the importance of careful evaluation before assuming a metastatic origin. The SQL and NLP algorithms demonstrated high accuracy in identifying lung nodule sentences, indicating potential for local service evaluation and research dataset creation. Machine-learning models exhibited strong accuracy in predicting concerning changes in lung nodule scan reports. While limitations include variability in disease group attribution, the potential for correlation rather than causality in clinical findings, and the need for further external validation, the algorithm's accuracy and potential to support clinical decision-making and healthcare automation represent a significant stride in lung nodule management and research.
Keywords: lung cancer diagnosis, structured-query-language (SQL), natural language processing (NLP), machine learning, CT scans
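As an illustration of the sentence-level classification step, the sketch below trains a simple bag-of-words classifier to flag report sentences that mention lung nodules. It uses a scikit-learn TF-IDF pipeline rather than the study's SQL/NLP algorithm, and the training sentences and labels are invented for demonstration.

```python
# Toy sentence classifier for nodule-mentioning report text; the example
# sentences and labels are invented, and this is not the study's pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

sentences = [
    "A 6 mm pulmonary nodule is seen in the right upper lobe.",
    "The heart size is normal.",
    "Stable 4 mm nodule in the left lower lobe, no new lesions.",
    "No focal consolidation or pleural effusion.",
]
labels = [1, 0, 1, 0]  # 1 = sentence describes a lung nodule

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(sentences, labels)
# Should flag the new nodule-mentioning sentence as class 1.
print(clf.predict(["New 8 mm nodule adjacent to the pleura."]))
```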
Procedia PDF Downloads 100

1619 Image Segmentation with Deep Learning of Prostate Cancer Bone Metastases on Computed Tomography
Authors: Joseph M. Rich, Vinay A. Duddalwar, Assad A. Oberai
Abstract:
Prostate adenocarcinoma is the most common cancer in males, with osseous metastases as the commonest site of metastatic prostate carcinoma (mPC). Treatment monitoring is based on the evaluation and characterization of lesions on multiple imaging studies, including computed tomography (CT). Monitoring of the osseous disease burden, including follow-up of lesions and identification and characterization of new lesions, is a laborious task for radiologists. Deep learning algorithms are increasingly used to perform tasks such as identification and segmentation of osseous metastatic disease and provide accurate information regarding metastatic burden. Here, nnUNet was used to produce a model which can segment CT scan images of prostate adenocarcinoma vertebral bone metastatic lesions. nnUNet is an open-source Python package that adds optimizations to the deep learning-based UNet architecture but has not been extensively combined with transfer learning techniques due to the absence of readily available functionality for this method. The IRB-approved study data set includes imaging studies from patients with mPC who were enrolled in clinical trials at the University of Southern California (USC) Health Science Campus and Los Angeles County (LAC)/USC medical center. Manual segmentation of metastatic lesions was completed by an expert radiologist, Dr. Vinay Duddalwar (20+ years in radiology and oncologic imaging), to serve as ground truth for the automated segmentation. Despite nnUNet’s success on some medical segmentation tasks, it only produced an average Dice Similarity Coefficient (DSC) of 0.31 on the USC dataset. DSC results fell in a bimodal distribution, with most scores falling either over 0.66 (reasonably accurate) or at 0 (no lesion detected). Applying more aggressive data augmentation techniques dropped the DSC to 0.15, and reducing the number of epochs reduced the DSC to below 0.1. Datasets have been identified for transfer learning, which involves balancing the size and similarity of the datasets. Identified datasets include the pancreas data from the Medical Segmentation Decathlon, Pelvic Reference Data, and CT volumes with multiple organ segmentations (CT-ORG). Some of the challenges of producing an accurate model from the USC dataset include the small dataset size (115 images), 2D data (as nnUNet generally performs better on 3D data), and the limited amount of public data capturing annotated CT images of bone lesions. Optimizations and improvements will be made by applying transfer learning and generative methods, including incorporating generative adversarial networks and diffusion models, in order to augment the dataset. Performance with different libraries, including MONAI and custom architectures with PyTorch, will be compared. In the future, molecular correlations will be tracked with radiologic features for the purpose of multimodal composite biomarker identification. Once validated, these models will be incorporated into evaluation workflows to optimize radiologist evaluation. Our work demonstrates the challenges of applying automated image segmentation to small medical datasets and lays a foundation for techniques to improve performance. As machine learning models become increasingly incorporated into the workflow of radiologists, these findings will help improve the speed and accuracy of vertebral metastatic lesion detection.
Keywords: deep learning, image segmentation, medicine, nnUNet, prostate carcinoma, radiomics
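The headline metric here, the Dice Similarity Coefficient, is straightforward to compute. Below is a minimal NumPy sketch of DSC on toy binary masks; the arrays are invented, not the study's segmentations.

```python
# Dice Similarity Coefficient for binary segmentation masks; the two 64x64
# masks below are toy examples, not data from the study.
import numpy as np

def dice(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-8) -> float:
    """DSC = 2|A intersect B| / (|A| + |B|) for binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum() + eps)

pred = np.zeros((64, 64)); pred[20:40, 20:40] = 1    # predicted lesion
truth = np.zeros((64, 64)); truth[25:45, 22:42] = 1  # manual ground truth
print(f"DSC = {dice(pred, truth):.2f}")
```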
Procedia PDF Downloads 96

1618 Aire-Dependent Transcripts Have Shortened 3’UTRs and Show Greater Stability by Evading MicroRNA-Mediated Repression
Authors: Clotilde Guyon, Nada Jmari, Yen-Chin Li, Jean Denoyel, Noriyuki Fujikado, Christophe Blanchet, David Root, Matthieu Giraud
Abstract:
Aire induces ectopic expression of a large repertoire of tissue-specific antigen (TSA) genes in thymic medullary epithelial cells (MECs), driving immunological self-tolerance in maturing T cells. Although important mechanisms of Aire-induced transcription have recently been disclosed through the identification and study of Aire’s partners, the fine transcriptional functions underpinned by a number of them and conferred on Aire are still unknown. Alternative cleavage and polyadenylation (APA) is an essential mRNA processing step regulated by the termination complex, which consists of 85 proteins, 10 of which have been related to Aire. We evaluated APA in MECs in vivo by microarray analysis with mRNA-spanning probes and by RNA deep sequencing. We uncovered the preference of Aire-dependent transcripts for short-3’UTR isoforms and for proximal poly(A) site selection, marked by the increased binding of the cleavage factor Cstf-64. RNA interference of the 10 Aire-related proteins revealed that Clp1, a member of the core termination complex, exerts a profound effect on short 3’UTR isoform preference. Clp1 is also significantly upregulated in MECs compared to 25 mouse tissues, in which we found that TSA expression is associated with longer 3’UTR isoforms. Aire-dependent transcripts escape a global 3’UTR lengthening associated with MEC differentiation, thereby potentiating the repressive effect of microRNAs that are globally upregulated in mature MECs. Consistent with these findings, RNA deep sequencing of actinomycin D-treated MECs revealed the increased stability of short-3’UTR Aire-induced transcripts, resulting in TSA transcript accumulation and contributing to their enrichment in MECs.
Keywords: Aire, central tolerance, miRNAs, transcription termination
Procedia PDF Downloads 383

1617 The Influence of Microscopic Features on the Self-Cleaning Ability of Developed 3D Printed Fabric-Like Structures Using Different Printing Parameters
Authors: Ayat Adnan Atwah, Muhammad A. Khan
Abstract:
Self-cleaning surfaces are getting significant attention in industrial fields, especially for textile fabrics. It is observed that self-cleaning textile fabric surfaces are usually created by manipulating the surface features with the help of coatings and nanoparticles, which are considered costly and far more complicated. However, controlling the fabrication parameters of textile fabrics at the microscopic level to explore the potential for self-cleaning has not been addressed. This study aimed to establish the context of self-cleaning textile fabrics by controlling the fabrication parameters of the textile fabric at the microscopic level. Therefore, 3D-printed textile fabrics were fabricated using the low-cost fused filament fabrication (FFF) technique. The printing parameters, namely orientation angle (O), layer height (LH), and extruder width (EW), were used to control the microscopic features of the printed fabrics. Combinations of the three printing parameters were created to provide the best self-cleaning textile fabric surface: LH (0.15, 0.13, 0.10 mm) and EW (0.5, 0.4, 0.3 mm), along with two different O (45° and 90°). Three different thermoplastic flexible filament materials were used: TPU 98A, TPE felaflex, and TPC flex45. The printing parameters were optimised to obtain the optimum self-cleaning ability of the printed specimens. Furthermore, the impact of these characteristics on mechanical strength at the fabric woven-structure level was investigated. The study revealed that the printing parameters significantly affect the self-cleaning properties after adjusting the selected combination of layer height, extruder width, and printing orientation. A linear regression model was effectively developed to demonstrate the association between the 3D printing parameters (layer height, extruder width, and orientation) and self-cleaning ability. According to the experimental results, TPE felaflex has a better self-cleaning ability than the other two materials.
Keywords: 3D printing, self-cleaning fabric, microscopic features, printing parameters, fabrication
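A linear model of the kind described can be fitted in a few lines. The sketch below regresses a self-cleaning score on the three printing parameters; the parameter levels are taken from the abstract, but the response values are invented placeholders, not the study's measurements.

```python
# Fitting: self-cleaning score ~ layer height + extruder width + orientation.
# The response values below are invented placeholders, not the study's data.
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: layer height (mm), extruder width (mm), orientation (deg).
X = np.array([[0.15, 0.5, 45], [0.13, 0.4, 45], [0.10, 0.3, 45],
              [0.15, 0.5, 90], [0.13, 0.4, 90], [0.10, 0.3, 90]])
y = np.array([0.62, 0.71, 0.83, 0.58, 0.68, 0.80])  # hypothetical scores

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("predicted score:", model.predict([[0.10, 0.3, 45]]))
```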
Procedia PDF Downloads 90

1616 Characteristics and Challenges of Post-Burn Contractures in Adults and Children: A Descriptive Study
Authors: Hardisiswo Soedjana, Inne Caroline
Abstract:
Deep dermal or full-thickness burns inevitably lead to post-burn contractures. These contractures remain one of the most concerning late complications of burn injuries. Surgical management includes releasing the contracture, followed by resurfacing the defect, accompanied by post-operative rehabilitation. Optimal treatment of post-burn contractures depends on the characteristics of the contractures. This study aimed to describe the clinical characteristics, problems, and management of post-burn contractures in adults and children. A retrospective analysis was conducted of medical records of patients suffering from contractures after burn injuries admitted to Hasan Sadikin general hospital between January 2016 and January 2018. A total of 50 patients with post-burn contractures were included in the study. There were 17 adults and 33 children. Most patients were male, with ages ranging from 15 to 59 years among adults and from 5 to 9 years among children. The educational background was mostly senior high school among adults, while only one third of the children had entered school. The etiology of burns was predominantly flame in adults (82.3%), whereas flame and scald were the leading causes of burn injury in children (11%). Based on anatomical regions, the hands were most commonly affected both in adults (35.2%) and children (48.5%). Contractures were identified 6-12 months after the initial burns. Most post-burn hand contractures were resurfaced with full-thickness skin grafts (FTSG) both in adults and children. There were 11 patients who presented with recurrent contracture after a previous history of contracture release. Post-operative rehabilitation was conducted for all patients; however, it is important to highlight that controlling splinting and exercise after patients are discharged remains challenging, especially compliance in children. In order to improve the quality of life in patients with a history of deep burn injuries, prevention of contractures should begin right after acute care has been established. Education on the importance of splinting and exercise should be provided as comprehensibly as possible to adult patients and the parents of pediatric patients.
Keywords: burn, contracture, education, exercise, splinting
Procedia PDF Downloads 130

1615 Tracking of Intramuscular Stem Cells by Magnetic Resonance Diffusion Weighted Imaging
Authors: Balakrishna Shetty
Abstract:
Introduction: Stem cell imaging has been a challenging field since the advent of stem cell treatment in humans. A series of studies on tagging and tracking stem cells has not been very effective. The present study is an effort by the authors to track stem cells injected into calf muscles by magnetic resonance diffusion weighted imaging. Materials and methods: Stem cell injection deep into the calf muscles of patients with peripheral vascular disease is one of the recent treatment modalities followed in our institution. Five patients who underwent deep intramuscular injection of stem cells as treatment were included in this study. Pre-injection and two-hour post-injection MRI of the bilateral calf regions was performed using a 1.5 T Philips Achieva 16-channel system with 16-channel torso coils. Axial STIR and axial diffusion weighted images with b=0 and b=1000 values with background suppression (the DWIBS sequence of Philips MR imaging systems) were obtained at 5 mm intervals covering the entire calf. Inverted images were obtained for better visualization. 120 ml of autologous bone marrow derived stem cells were processed and enriched under c-GMP conditions and reduced to a 40 ml solution containing a mixture of the above stem cells. Approximately 40 to 50 injections, each containing 0.75 ml of processed stem cells, were administered over a marked grid on the calf region. Around 40 injections, each of 1 ml normal saline, were given in the contralateral leg as a control. Results: Significant diffusion hyperintensity was noted at the sites of the injected stem cells. No hyperintensity was noted before the injection, nor in the control side where saline was injected. Conclusion and discussion: This is one of the earliest studies in the literature showing diffusion hyperintensity in intramuscularly injected stem cells. The advantages and deficiencies of this study will be discussed during the presentation.
Keywords: stem cells, imaging, DWI, peripheral vascular disease
Procedia PDF Downloads 74

1614 Social Semantic Web-Based Analytics Approach to Support Lifelong Learning
Authors: Khaled Halimi, Hassina Seridi-Bouchelaghem
Abstract:
The purpose of this paper is to describe how learning analytics approaches based on social semantic web techniques can be applied to enhance lifelong learning experiences from a connectivist perspective. For this reason, a prototype of a system called SoLearn (Social Learning Environment) that supports this approach was developed. We observed and studied the literature related to lifelong learning systems, the social semantic web and ontologies, connectivism theory, and learning analytics approaches, and reviewed implemented systems based on these fields, to extract and draw conclusions about the features necessary for enhancing the lifelong learning process. The semantic analytics of learning can be used for viewing, studying and analysing the massive data generated by learners, which helps them to understand, through recommendations, charts and figures, their learning and behaviour, and to detect where they have weaknesses or limitations. This paper emphasises that implementing a learning analytics approach based on social semantic web representations can enhance the learning process. On the one hand, the analysis process leverages the meaning expressed by the semantics presented in the ontology (relationships between concepts). On the other hand, the analysis process exploits the discovery of new knowledge by means of the inference mechanisms of the semantic web.
Keywords: connectivism, learning analytics, lifelong learning, social semantic web
Procedia PDF Downloads 214

1613 Identification of Deposition Sequences of the Organic Content of Lower Albian-Cenomanian Age in Northern Tunisia: Correlation between Molecular and Stratigraphic Fossils
Authors: Tahani Hallek, Dhaou Akrout, Riadh Ahmadi, Mabrouk Montacer
Abstract:
The present work is an organic geochemical study of the Fahdene Formation outcrops at the Mahjouba region, belonging to the eastern part of the Kalaat Senan structure in northwestern Tunisia (the Kef-Tedjerouine area). The analytical study of the organic content of the collected samples allowed us to establish that the Formation in question is characterized by an average to good oil potential. This fossilized organic matter has a mixed origin (types II and III), as indicated by the relatively high values of the hydrogen index. This origin is confirmed by the abundance of C29 steranes and also by the tricyclic terpane C19/(C19+C23) and tetracyclic terpane C24/(C24+C23) ratios, which suggest a marine depositional environment with a contribution from higher plants. We have demonstrated that the heterogeneity of the organic matter, between the marine character confirmed by the presence of foraminifera and the continental contribution, is the result of an episodic anomaly in relation to the sequence stratigraphy. Given that the study area is defined as an outer platform forming a transition zone between a stable continental domain to the south and a deep basin to the north, we explain the continental contribution by successive forced regressions that interrupted the Albian transgression, allowing the installation of lowstand system tracts. This aspect is represented by incised-valley fill in direct contact with pelagic, deep-sea facies. Consequently, the Fahdene Formation in the Kef-Tedjerouine area consists of transgressive system tracts (TST) abruptly truncated by episodes of continental progradation, resulting in a mixed-influence deposit that has retained heterogeneous organic material.
Keywords: molecular geochemistry, biomarkers, forced regression, deposit environment, mixed origin, Northern Tunisia
Procedia PDF Downloads 249

1612 In-Silico Fusion of Bacillus Licheniformis Chitin Deacetylase with Chitin Binding Domains from Chitinases
Authors: Keyur Raval, Steffen Krohn, Bruno Moerschbacher
Abstract:
Chitin, the biopolymer of N-acetylglucosamine, is the most abundant biopolymer on the planet after cellulose. Industrially, chitin is isolated and purified from the shell residues of shrimps. A deacetylated derivative of chitin, i.e. chitosan, has more market value and applications owing to its solubility and overall cationic charge compared to the parent polymer. This deacetylation on an industrial scale is performed chemically using alkalis like sodium hydroxide. This reaction is hazardous to the environment owing to its negative impact on the marine ecosystem. A greener option for this process is the enzymatic one. In nature, naïve chitin is converted to chitosan by chitin deacetylase (CDA). This enzymatic conversion on the industrial scale is, however, hampered by the crystallinity of chitin. Thus, the enzymatic action requires the substrate, i.e. chitin, to be soluble, which is technically difficult and an energy-consuming process. In this project, we wanted to address this shortcoming of CDA. To this end, we have modeled a fusion protein combining CDA and an auxiliary protein, the main interest being to increase the accessibility of the enzyme towards crystalline chitin. Similar fusion work with chitinases had improved their catalytic ability towards insoluble chitin. In the first step, suitable partners were sought in the Protein Data Bank (PDB), where the domain architectures were examined. The next step was to create models of the fused product using various in silico techniques. The models were created by MODELLER and evaluated for properties such as the energy or the impairment of the binding sites. A fusion PCR has been designed based on the linker sequences generated by MODELLER and will be tested for its activity towards insoluble chitin.
Keywords: chitin deacetylase, modeling, chitin binding domain, chitinases
Procedia PDF Downloads 242

1611 A Framework of Dynamic Rule Selection Method for Dynamic Flexible Job Shop Problem by Reinforcement Learning Method
Authors: Rui Wu
Abstract:
In the volatile modern manufacturing environment, new orders arrive randomly at any time, while pre-emptive methods are infeasible. This calls for a real-time scheduling method that can produce a reasonably good schedule quickly. The dynamic flexible job shop problem is an NP-hard scheduling problem that hybridizes the dynamic job shop problem with the parallel machine problem. A flexible job shop contains different work centres. Each work centre contains parallel machines that can process certain operations. Many algorithms, such as genetic algorithms or simulated annealing, have been proposed to solve static flexible job shop problems. However, the time efficiency of these methods is low, and they are not feasible for a dynamic scheduling problem. Therefore, a dynamic rule selection scheduling system based on reinforcement learning is proposed in this research, in which the dynamic flexible job shop problem is divided into several parallel machine problems to decrease the complexity of the overall problem. Firstly, features of jobs, machines, work centres, and the flexible job shop are selected to describe the status of the dynamic flexible job shop problem at each decision point in each work centre. Secondly, a reinforcement learning framework using a double-layer deep Q-learning network is applied to select proper composite dispatching rules based on the status of each work centre. Then, based on the selected composite dispatching rule, an available operation is selected from the waiting buffer and assigned to an available machine in each work centre. Finally, the proposed algorithm is compared with well-known dispatching rules on the objectives of mean tardiness, mean flow time, mean waiting time, and mean percentage of waiting time in the real-time flexible job shop problem. The results of the simulations show that the proposed framework has reasonable performance and time efficiency.
Keywords: dynamic scheduling problem, flexible job shop, dispatching rules, deep reinforcement learning
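A minimal sketch of the rule-selection idea: a deep Q-network scores a set of composite dispatching rules given a work-centre state vector, and an epsilon-greedy policy picks the rule to apply at each decision point. This is a schematic PyTorch illustration under assumed state features and rule names, not the paper's double-layer network.

```python
# Epsilon-greedy dispatching-rule selection with a small Q-network and one
# TD(0) update. State features and the rule set are illustrative assumptions.
import random
import torch
import torch.nn as nn

RULES = ["SPT", "EDD", "FIFO", "SPT+EDD"]    # composite dispatching rules

class QNet(nn.Module):
    def __init__(self, state_dim, n_rules):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(),
                                 nn.Linear(64, n_rules))
    def forward(self, s):
        return self.net(s)                    # Q-value per dispatching rule

q, target_q = QNet(8, len(RULES)), QNet(8, len(RULES))
target_q.load_state_dict(q.state_dict())      # frozen target for stable TD aims
opt = torch.optim.Adam(q.parameters(), lr=1e-3)

def select_rule(state, eps=0.1):
    if random.random() < eps:                 # explore
        return random.randrange(len(RULES))
    return int(q(state).argmax())             # exploit

# One TD(0) update for a transition (s, a, r, s'); values are toy data.
s, s2 = torch.randn(8), torch.randn(8)
a, r = select_rule(s), -3.2                   # reward, e.g. negative tardiness
with torch.no_grad():
    target = r + 0.99 * target_q(s2).max()
loss = (q(s)[a] - target) ** 2
opt.zero_grad(); loss.backward(); opt.step()
print("chosen rule:", RULES[a])
```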
Procedia PDF Downloads 107

1610 Optimizing the Efficiency of Measuring Instruments in Ouagadougou-Burkina Faso
Authors: Moses Emetere, Marvel Akinyemi, S. E. Sanni
Abstract:
At the moment, the AERONET and AMMA databases show a large volume of data loss. With only about 47% of the data set available to scientists, it is evident that accurate nowcasts or forecasts cannot be guaranteed. The calibration constants of most radiosondes or weather stations are not compatible with the atmospheric conditions of the West African climate. A dispersion model was developed to incorporate salient mathematical representations such as a unified number. The unified number was derived to describe the turbulence of aerosol transport in the frictional layer of the lower atmosphere. A fourteen-year data set from the Multi-angle Imaging SpectroRadiometer (MISR) was tested using the dispersion model. A yearly estimation of the atmospheric constants over Ouagadougou using the model was obtained with about 87.5% accuracy. It further revealed that the average atmospheric constants for Ouagadougou are a_1 = 0.626 and a_2 = 0.7999, and the tuning constants are n_1 = 0.09835 and n_2 = 0.266. The yearly atmospheric constants also affirmed that the lower atmosphere of Ouagadougou is very dynamic. Hence, it is recommended that radiosonde and weather station manufacturers constantly review the atmospheric constants over a geographical location to enable about eighty percent data retrieval.
Keywords: aerosols retention, aerosols loading, statistics, analytical technique
Procedia PDF Downloads 315

1609 Closed Incision Negative Pressure Therapy Dressing as an Approach to Manage Closed Sternal Incisions in High-Risk Cardiac Patients: A Multi-Centre Study in the UK
Authors: Rona Lee Suelo-Calanao, Mahmoud Loubani
Abstract:
Objective: Sternal wound infection (SWI) following cardiac operations has a significant impact on patient morbidity and mortality. It also contributes to longer hospital stays and increased treatment costs. SWI management is mainly focused on treatment rather than prevention. This study looks at the effect of closed incision negative pressure therapy (ciNPT) dressing in helping reduce the incidence of superficial SWI in high-risk patients after cardiac surgery. The ciNPT dressing was evaluated at 3 cardiac hospitals in the United Kingdom. Methods: All patients who had cardiac surgery from 2013 to 2021 were included in the study. Patients were classed as high risk if they had two or more of the recognised risk factors: obesity, age above 80 years old, diabetes, and chronic obstructive pulmonary disease. Patients receiving standard dressing (SD) and patients receiving ciNPT were propensity matched, and Fisher’s exact test (two-tailed) and the unpaired t-test were used to analyse categorical and continuous data, respectively. Results: There were 766 matched cases in each group. Total SWI incidence was lower in the ciNPT group compared to the SD group (43 (5.6%) vs 119 (15.5%), p=0.0001). There were fewer deep sternal wound infections (14 (1.8%) vs. 31 (4.04%), p=0.0149) and fewer superficial infections (29 (3.7%) vs. 88 (11.4%), p=0.0001) in the ciNPT group compared to the SD group. However, the ciNPT group showed a longer average length of stay (11.23 ± 13 days versus 9.66 ± 10 days; p=0.0083) and a higher mean logistic EuroSCORE (11.143 ± 13 versus 8.094 ± 11; p=0.0001). Conclusion: Utilization of ciNPT as an approach to help reduce the incidence of superficial and deep SWI may be effective in high-risk patients requiring cardiac surgery.
Keywords: closed incision negative pressure therapy, surgical wound infection, cardiac surgery complication, high risk cardiac patients
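Fisher's exact test on the headline comparison can be reproduced from the counts quoted above (43 of 766 ciNPT patients vs. 119 of 766 SD patients with SWI). The SciPy sketch below illustrates the test itself; it is not the authors' analysis script.

```python
# Two-tailed Fisher's exact test on the 2x2 SWI table built from the counts
# quoted in the abstract (43/766 ciNPT vs. 119/766 standard dressing).
from scipy.stats import fisher_exact

table = [[43, 766 - 43],      # ciNPT: infected, not infected
         [119, 766 - 119]]    # standard dressing: infected, not infected
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")
```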
Procedia PDF Downloads 96

1608 Analysis of the Accuracy of Earth Movement with Drone Surveys
Authors: Raúl Pereda García, Julio Manuel de Luis Ruiz, Elena Castillo López, Rubén Pérez Álvarez, Felipe Piña García
Abstract:
New technologies for the capture of point clouds have advanced greatly in recent years. Their use has consequently spread in geomatics, providing measurement solutions that have been popularized, often without a detailed study of their accuracy. This research focuses on the viability of topographic work with drones carrying different sensors sensitive to the visible spectrum. The fundamentals were applied to a road located in Cantabria (Spain), where a platform extension and the reform of a riprap were under construction. A total of six flights were made over two months, all of them using GPS as part of the photogrammetric process, and the results were contrasted with measurements taken with a total station. The results show that the choice of camera and the planning of the flight have an important impact on accuracy. In fact, representations with a level of detail corresponding to a 1/1000 scale are admissible, depending on the existing vegetation, with better results obtained in the area of the riprap. This set of techniques is, therefore, suitable for the control of earthworks in road construction, but with certain limitations, which are set out in this paper.
Keywords: drone, earth movement control, global positioning system, surveying technology
Procedia PDF Downloads 184

1607 Analogical Reasoning on Preschoolers’ Linguistic Performance
Authors: Yenie Norambuena
Abstract:
Analogical reasoning is a cognitive process that consists of structured comparisons of mental representations and scheme construction. Because of its heuristic function, it is ubiquitous in cognition and could play an important role in language development. The use of analogies appears early in children, and this behavior is also reflected in language, suggesting a possible way to understand the complex links between thought and language. The current research examines factors of verbal and non-verbal reasoning that should be taken into consideration in the study of language development, given their interrelations and predictive value. The study was conducted with 48 Chilean preschoolers (Spanish speakers) from 4 to 6 years old. We assessed the children's verbal analogical reasoning, non-verbal analogical reasoning, and linguistic skills (listening comprehension, phonemic awareness, alphabetic principle, syllabification, lexical repetition, and lexical decision). The results evidenced significant correlations between analogical reasoning factors and linguistic skills, and showed that they can predict linguistic performance, mainly in oral comprehension, lexical decision, and phonological skills. These findings suggest a fundamental interrelationship between analogical reasoning and linguistic performance in children and point to the need to consider this cognitive process in comprehensive theories of children's language development.
Keywords: verbal analogical reasoning, non-verbal analogical reasoning, linguistic skills, language development
Procedia PDF Downloads 266

1606 Flood Devastation Assessment Through Mapping in Nigeria-2022 Using Geospatial Techniques
Authors: Hafiz Muhammad Tayyab Bhatti, Munazza Usmani
Abstract:
Floods are one of nature's most destructive occurrences, doing immense damage to communities and causing severe economic losses. Nigeria, specifically southern Nigeria, is known for being prone to flooding. Even though periodic flooding occurs frequently in Nigeria, the floods of 2022 were the worst since those of 2012. Flood vulnerability analysis and mapping are still lacking in this region due to the very limited historical hydrological measurements and surveys on the effects of floods, which makes it difficult to develop and put into practice efficient flood protection measures. Remote sensing and Geographic Information Systems (GIS) are useful approaches for detecting, determining, and estimating flood extent and its impacts. In this study, NOAA VIIRS data were used to extract the flood extent from the flood water fraction data and afterwards fused with GIS data for zonal statistical analysis. The estimated possible flooding areas were validated using satellite imagery from the Moderate Resolution Imaging Spectroradiometer (MODIS). The goal is to map and study the flood extent, flood hazards, and their effects on the population, schools, and health facilities for each state of Nigeria. The resulting flood hazard maps show areas with high risk levels clearly and serve as an important reference for planning and implementing future flood mitigation and control strategies. Overall, the study demonstrated the viability of using the chosen GIS and remote sensing approaches to detect possible risk regions in order to secure local populations and enhance disaster response capabilities during natural disasters.
Keywords: flood hazards, remote sensing, damage assessment, GIS, geospatial analysis
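The "fused with GIS data for zonal statistical analysis" step boils down to aggregating a flood mask within administrative zones. The NumPy sketch below illustrates that aggregation with toy rasters; the arrays are hypothetical stand-ins for the VIIRS-derived flood mask and a state-ID raster, not the study's data.

```python
# Zonal statistics over a binary flood mask: flooded-pixel fraction per zone.
# The 4x4 rasters below are toy arrays, not the study's data.
import numpy as np

flood = np.array([[1, 1, 0, 0],
                  [1, 0, 0, 0],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]])           # 1 = flooded pixel

zones = np.array([[1, 1, 2, 2],
                  [1, 1, 2, 2],
                  [3, 3, 4, 4],
                  [3, 3, 4, 4]])           # state/zone identifiers

for zone_id in np.unique(zones):
    mask = zones == zone_id
    fraction = flood[mask].mean()          # share of flooded pixels in the zone
    print(f"zone {zone_id}: {fraction:.0%} flooded")
```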
Procedia PDF Downloads 137

1605 The Evaluation of Superiority of Foot Local Anesthesia Method in Dairy Cows
Authors: Samaneh Yavari, Christiane Pferrer, Elisabeth Engelke, Alexander Starke, Juergen Rehage
Abstract:
Background: Nowadays, bovine limb interventions, especially claw surgeries, require the selection of the most suitable local anesthesia technique applicable for any superficial or deep intervention on the limbs. Currently, two local anesthesia methods are routinely applied: intravenous regional anesthesia (IVRA) and nerve blocks. However, the lack of studies investigating the quality and duration, as well as the quantity and onset, of full (complete) local anesthesia is noticeable. Therefore, the aim of our study was to compare the onset and quality of both IVRA and our modified nerve block anesthesia (NBA) in the hind limb of dairy cows. For this abstract, only the onset of full local anesthesia is considered. Materials and Methods: We used six healthy, non-pregnant, non-lactating Holstein Friesian cows in a cross-over study design. The cows were divided into two groups to receive IVRA and our modified four-point NBA. For IVRA, 20 ml of procaine without epinephrine was injected into the vena digitalis dorsalis communis III, and for our modified four-point NBA, 10-15 ml of procaine without epinephrine was injected perineurally to the superficial and deep peroneal nerves as well as the lateral and medial branches of the metatarsal nerves. For pain stimulation, a Grass S48 electrical stimulator was applied. Results: The results of the electrical stimuli revealed a faster onset of full local anesthesia (p < 0.05) with our modified NBA in comparison to IVRA, by about 10 minutes. Conclusion and discussion: Despite available references showing a faster onset of foot local anesthesia with IVRA, our study demonstrated that our modified four-point NBA can not only be regarded as a standard foot local anesthesia method applicable to desensitize the hind limb of dairy cows, but also that selecting this modified, validated local anesthesia method can lead to a faster onset of complete desensitization of the distal hind limb, which is notable in any bovine limb intervention under time constraints.
Keywords: IVRA, four point NBA, dairy cow, hind limb, full onset
Procedia PDF Downloads 151

1604 Source Identification Model Based on Label Propagation and Graph Ordinary Differential Equations
Authors: Fuyuan Ma, Yuhan Wang, Junhe Zhang, Ying Wang
Abstract:
Identifying the sources of information dissemination is a pivotal task in the study of collective behaviors in networks, enabling us to discern and intercept the critical pathways through which information propagates from its origins. This allows the dissemination impact of information to be controlled in its early stages. Numerous methods for source detection rely on pre-existing underlying propagation models as prior knowledge. Current models that eschew prior knowledge attempt to harness label propagation algorithms to model the statistical characteristics of propagation states, or employ graph neural networks (GNNs) for deep reverse modeling of the diffusion process. These approaches are either deficient in modeling the propagation patterns of information or are constrained by the over-smoothing problem inherent in GNNs, which limits the stacking of sufficient model depth to excavate global propagation patterns. Consequently, we introduce the ODESI model. Initially, the model employs a label propagation algorithm to delineate the distribution density of infected states within a graph structure and extends the representation of infected states from integers to state vectors, which serve as the initial states of nodes. Subsequently, the model constructs a deep architecture based on GNN-coupled ordinary differential equations (ODEs) to model the global propagation patterns of continuous propagation processes. Addressing the challenges associated with solving ODEs on graphs, we approximate the analytical solutions to reduce computational costs. Finally, we conduct simulation experiments on two real-world social network datasets, and the results affirm the efficacy of our proposed ODESI model in source identification tasks.
Keywords: source identification, ordinary differential equations, label propagation, complex networks
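The two ingredients named in the abstract can be sketched compactly: a label propagation step that spreads observed infection states over a row-normalized adjacency matrix, and an Euler-integrated graph ODE dX/dt = -LX that smooths node states continuously. The NumPy code below is a conceptual illustration on a toy graph, not the ODESI implementation.

```python
# Label propagation plus an Euler-discretized graph ODE on a toy graph.
# This is a conceptual sketch, not the ODESI architecture.
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)   # toy adjacency matrix
P = A / A.sum(axis=1, keepdims=True)        # row-normalized propagation matrix

x = np.array([0.0, 0.0, 1.0, 0.0])          # observed infection states
for _ in range(5):                          # label propagation iterations
    x = 0.5 * P @ x + 0.5 * x               # spread while retaining own label
print("propagated density:", x.round(3))

# Continuous view: dX/dt = -L X with graph Laplacian L, Euler integration.
L = np.diag(A.sum(axis=1)) - A
X, dt = x.copy(), 0.05
for _ in range(40):
    X = X + dt * (-L @ X)                   # one Euler integration step
print("ODE-smoothed states:", X.round(3))
```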
Procedia PDF Downloads 20

1603 Risk Assessment Tools Applied to Deep Vein Thrombosis Patients Treated with Warfarin
Authors: Kylie Mueller, Nijole Bernaitis, Shailendra Anoopkumar-Dukie
Abstract:
Background: Vitamin K antagonists, particularly warfarin, are the most frequently used oral medication for deep vein thrombosis (DVT) treatment and prophylaxis. Time in therapeutic range (TITR) of the international normalised ratio (INR) is widely accepted as a measure of the quality of warfarin therapy. Multiple factors can affect warfarin control and the subsequent adverse outcomes, including thromboembolic and bleeding events. Predictor models have been developed to assess potential contributing factors and measure the individual risk of these adverse events. These predictive models have been validated in atrial fibrillation (AF) patients; however, there is a lack of literature on whether they can be successfully applied to other warfarin users, including DVT patients. Therefore, the aim of the study was to assess the ability of these risk models (HAS-BLED and CHADS2) to predict haemorrhagic and ischaemic incidences in DVT patients treated with warfarin. Methods: A retrospective analysis of DVT patients receiving warfarin management by a private pathology clinic was conducted. Data were collected from November 2007 to September 2014 and included demographics, medical and drug history, INR targets, and test results. Patients receiving continuous warfarin therapy with an INR reference range between 2.0 and 3.0 were included in the study, with mean TITR calculated using the Rosendaal method. Bleeding and thromboembolic events were recorded and reported as incidences per patient. The haemorrhagic risk model HAS-BLED and the ischaemic risk model CHADS2 were applied to the data. Patients were then stratified into the low, moderate, or high-risk categories. The analysis was conducted to determine whether a correlation existed between the risk assessment tools and patient outcomes. Data were analysed using GraphPad InStat Version 3, with a p value of <0.05 considered statistically significant. Patient characteristics were reported as mean and standard deviation for continuous data, and categorical data were reported as number and percentage. Results: Of the 533 patients included in the study, there were 268 (50.2%) female and 265 (49.8%) male patients with a mean age of 62.5 years (±16.4). The overall mean TITR was 78.3% (±12.7), with an overall haemorrhagic incidence of 0.41 events per patient. For the HAS-BLED model, there was a haemorrhagic incidence of 0.08, 0.53, and 0.54 per patient in the low, moderate, and high-risk categories, respectively, showing a statistically significant increase in incidence with increasing risk category. The CHADS2 model showed an increase in ischaemic events with risk category, with no ischaemic events in the low category, an ischaemic incidence of 0.03 in the moderate category, and 0.47 in the high-risk category. Conclusion: Increasing haemorrhagic incidence correlated with an increasing HAS-BLED risk score in DVT patients treated with warfarin. Furthermore, a greater incidence of ischaemic events occurred in patients with a higher CHADS2 category. In an Australian population of DVT patients, HAS-BLED and CHADS2 accurately predict incidences of haemorrhage and ischaemic events, respectively.
Keywords: anticoagulant agent, deep vein thrombosis, risk assessment, warfarin
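The Rosendaal method referenced above assumes the INR changes linearly between consecutive tests and allocates each interpolated day as in or out of range. The sketch below is a minimal implementation of that idea; the test dates and INR values are invented for illustration.

```python
# Rosendaal linear-interpolation TITR: the INR is assumed to change linearly
# between consecutive tests. Dates and INR values below are illustrative.
from datetime import date

def titr_rosendaal(tests, low=2.0, high=3.0):
    """tests: chronologically ordered list of (date, INR) pairs."""
    in_range_days = total_days = 0.0
    for (d0, i0), (d1, i1) in zip(tests, tests[1:]):
        days = (d1 - d0).days
        for step in range(days):               # interpolate one day at a time
            inr = i0 + (i1 - i0) * step / days
            in_range_days += low <= inr <= high
        total_days += days
    return 100.0 * in_range_days / total_days  # percent of time in range

tests = [(date(2014, 1, 1), 1.8), (date(2014, 1, 15), 2.6),
         (date(2014, 2, 5), 3.4), (date(2014, 2, 20), 2.4)]
print(f"TITR = {titr_rosendaal(tests):.1f}%")
```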
Procedia PDF Downloads 263
1602 Management Methods of Food Losses in Polish Processing Plants
Authors: Beata Bilska, Marzena Tomaszewska, Danuta Kolozyn-Krajewska
Abstract:
Food loss and food waste are a global problem of the modern economy. The research aimed to analyse how food is handled in catering establishments with respect to food waste and to identify the main ways of managing foods and dishes not served to consumers. A survey was conducted from January to June 2019. The selection of catering establishments participating in the study was deliberate, and the study included only establishments located in Mazowieckie Voivodeship (Poland). Forty-two completed questionnaires were collected. For some questions, answers were given on a 5-point scale of 1 to 5 (from "always"/"every day" to "never"); the survey also included closed questions with a suggested set of answers. The respondents stated that in their workplaces, dishes served cold and hot ready meals are discarded every day or almost every day (23.7% and 20.5% of answers, respectively). The procedure most frequently used for dishes not served to consumers on a given day is storage at a cool temperature until the following day. One-fifth of respondents admitted that consumers "always" or "usually" leave uneaten meals on their plates, and over 41% said they "sometimes" do so. It was additionally found that food not used in the foodservice sector is most often thrown into a public container for rubbish. Most often thrown into the public container (with communal trash) were expired products (80.0%), plate waste (80.0%), and inedible products such as fruit and vegetable peels and eggshells (77.5%). Used deep-frying oil was the item most frequently discarded into a container dedicated only to food waste (62.5%). Ten percent of respondents indicated that inedible products in their workplaces are allocated to animal feed. Food waste in the foodservice sector remains an insufficiently studied issue, as owners of these establishments are often unwilling to disclose data on the subject. Incorrect ways of managing foods not served to consumers were observed, and there is a need to develop educational activities for employees and management regarding food waste management in the foodservice sector.
Keywords: food waste, inedible products, plate waste, used deep-frying oil
Procedia PDF Downloads 125
1601 Modeling Studies on the Elevated Temperatures Formability of Tube Ends Using RSM
Authors: M. J. Davidson, N. Selvaraj, L. Venugopal
Abstract:
Elevated-temperature forming studies on the expansion of thin-walled tubes are presented. The influence of the process parameters, namely the die angle, the die ratio, and the operating temperature, on the expansion of tube ends at elevated temperatures is examined. The range of operating parameters was identified by performing extensive simulation studies, with the hot forming parameters evaluated for AA2014 alloy. The experimental matrix was developed from the feasible range obtained from the simulation results, and design of experiments was used for the optimization of process parameters. Response Surface Methodology (RSM) with a Box-Behnken design (BBD) was used to develop the mathematical model for expansion, and analysis of variance (ANOVA) was used to analyse the influence of process parameters on the expansion of tube ends. The effects of various process combinations on expansion are analysed through graphical representations. The developed model is found to be appropriate, as the coefficient of determination is very high (0.9726), and the predicted values coincide well with the experimental results, within acceptable error limits.
Keywords: expansion, optimization, Response Surface Methodology (RSM), ANOVA, BBD, residuals, regression, tube
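As an illustration of the modeling step, and not the authors' code, the sketch below fits the full second-order response surface that a three-factor Box-Behnken design supports, using ordinary least squares. The data are randomly generated rather than an actual BBD run matrix, and all names are assumptions.

```python
import numpy as np

def fit_quadratic_rsm(X, y):
    """Least-squares fit of a full second-order response surface
    y = b0 + sum(bi xi) + sum(bii xi^2) + sum(bij xi xj)
    for three coded factors (e.g. die angle, die ratio, temperature)."""
    x1, x2, x3 = X.T
    A = np.column_stack([
        np.ones(len(y)),            # intercept
        x1, x2, x3,                 # linear terms
        x1**2, x2**2, x3**2,        # quadratic terms
        x1*x2, x1*x3, x2*x3,        # two-factor interactions
    ])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    pred = A @ coef
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return coef, 1 - ss_res / ss_tot   # coefficients and R^2

# Toy usage: 15 runs (the size of a 3-factor BBD) with fabricated
# coded levels (-1, 0, +1) and a noisy synthetic response.
rng = np.random.default_rng(0)
X = rng.choice([-1.0, 0.0, 1.0], size=(15, 3))
y = 5 + 2*X[:, 0] - X[:, 1]**2 + 0.5*X[:, 0]*X[:, 2] + rng.normal(0, 0.1, 15)
coef, r2 = fit_quadratic_rsm(X, y)
print(f"R^2 = {r2:.4f}")
```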
Procedia PDF Downloads 509
1600 Detection of Safety Goggles on Humans in Industrial Environment Using Faster Region-Based Convolutional Neural Network with Rotated Bounding Box
Authors: Ankit Kamboj, Shikha Talwar, Nilesh Powar
Abstract:
To deliver products successfully to market, employees need a safe working environment, especially in industrial and manufacturing settings. Failing to wear safety glasses while working in industrial plants can put employees at high risk, hence the need for a real-time automatic detection system that detects persons (violators) not wearing safety glasses. In this study, a convolutional neural network (CNN) algorithm, Faster Region-Based CNN (Faster RCNN) with rotated bounding boxes, is used for detecting safety glasses on persons; the algorithm has the advantage of detecting safety glasses at different orientation angles. The proposed method first detects a person in the image and then determines whether that person is wearing safety glasses. Video data are captured at the entrances of restricted zones of the industrial environment (manufacturing plant) and converted into images at 2 frames per second. In the first step, a CNN with weights pre-trained on the COCO dataset is used for person detection, and the detections are cropped as images. The safety goggles are then labelled on the cropped images using roLabelImg, an image labelling tool that annotates the ground truth of rotated objects more accurately; the annotations are further converted into the four corner coordinates of the rotated rectangular bounding box. Next, Faster RCNN with rotated bounding boxes is used to detect the safety goggles, and its detection accuracy (average precision) is compared with that of the traditional axis-aligned bounding box Faster RCNN, demonstrating the effectiveness of the proposed method for detecting rotated objects. The deep learning benchmarking is performed on a Dell workstation with a 16 GB Nvidia GPU.
Keywords: CNN, deep learning, faster RCNN, roLabelImg, rotated bounding box, safety goggle detection
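The annotation conversion mentioned above, from a rotated box given as centre, size, and angle to its four corner points, amounts to rotating the axis-aligned corners about the box centre. A minimal sketch of that geometry, with a hypothetical function name not taken from the paper:

```python
import numpy as np

def rotated_box_corners(cx, cy, w, h, angle_deg):
    """Convert a rotated bounding box (centre, size, angle) into its four
    corner coordinates, as when turning roLabelImg-style annotations
    into four-point rectangles."""
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    # corners of the axis-aligned box, centred at the origin
    half = np.array([[-w, -h], [w, -h], [w, h], [-w, h]]) / 2.0
    # rotate each corner, then translate to the box centre
    return half @ rot.T + np.array([cx, cy])

# Toy usage: a 100x40 box centred at (200, 150), rotated 30 degrees.
print(rotated_box_corners(200, 150, 100, 40, 30).round(1))
```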
Procedia PDF Downloads 130