Search results for: sol-gel processing
2731 Contextual SenSe Model: Word Sense Disambiguation using Sense and Sense Value of Context Surrounding the Target
Authors: Vishal Raj, Noorhan Abbas
Abstract:
Ambiguity in NLP (Natural Language Processing) refers to the ability of a word, phrase, sentence, or text to have multiple meanings. This results in various kinds of ambiguities, such as lexical, syntactic, semantic, anaphoric, and referential ambiguities. This study focuses mainly on solving the issue of lexical ambiguity. Word Sense Disambiguation (WSD) is an NLP technique that aims to resolve lexical ambiguity by determining the correct meaning of a word within a given context. Most WSD solutions rely on words for training and testing, but we have used lemma and Part of Speech (POS) tokens of words for training and testing. The lemma adds generality, and the POS adds word properties to the token. We have designed a novel method to create an affinity matrix that calculates the affinity between any pair of lemma_POS tokens (a token where the lemma and POS of a word are joined by an underscore) in a given training set. Additionally, we have devised an algorithm to create sense clusters of tokens using the affinity matrix under a hierarchy of the POS of the lemma. Furthermore, three different mechanisms to predict the sense of a target word using the affinity/similarity value are devised. Each contextual token contributes to the sense of the target word with some value, and whichever sense gets the higher value becomes the sense of the target word. Contextual tokens thus play a key role in creating sense clusters and predicting the sense of the target word; hence, the model is named the Contextual SenSe Model (CSM). CSM is notably simple and easy to explain, in contrast to contemporary deep learning models characterized by intricacy, time-intensive processes, and challenging explication. CSM is trained on SemCor training data and evaluated on the SemEval test dataset. The results indicate that despite the naivety of the method, it achieves promising results when compared to the Most Frequent Sense (MFS) model.
Keywords: word sense disambiguation (wsd), contextual sense model (csm), most frequent sense (mfs), part of speech (pos), natural language processing (nlp), oov (out of vocabulary), lemma_pos (a token where lemma and pos of word are joined by underscore), information retrieval (ir), machine translation (mt)
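The affinity-based prediction step lends itself to a compact illustration. Below is a minimal Python sketch of one plausible reading of that mechanism: each context token votes for a candidate sense with its affinity value, and the highest-scoring sense wins. The affinity values, sense labels, and function names are invented for the example and are not taken from the paper.

```python
from collections import defaultdict

def predict_sense(target, context_tokens, affinity, sense_clusters):
    """Score each candidate sense of `target` by summing the affinity
    between the context tokens and the members of that sense's cluster.
    Tokens are lemma_POS strings, e.g. 'bank_NOUN'."""
    scores = defaultdict(float)
    for sense, members in sense_clusters[target].items():
        for ctx in context_tokens:
            for member in members:
                scores[sense] += affinity.get((ctx, member), 0.0)
    return max(scores, key=scores.get) if scores else None

# toy data with hypothetical affinity values
affinity = {
    ("river_NOUN", "shore_NOUN"): 0.9,
    ("money_NOUN", "deposit_NOUN"): 0.8,
}
sense_clusters = {
    "bank_NOUN": {
        "bank.riverside": ["shore_NOUN"],
        "bank.finance": ["deposit_NOUN"],
    }
}
print(predict_sense("bank_NOUN", ["river_NOUN"], affinity, sense_clusters))
# -> 'bank.riverside'
```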
Procedia PDF Downloads 107
2730 Enhancing the Flotation of Fine and Ultrafine Pyrite Particles Using Electrolytically Generated Bubbles
Authors: Bogale Tadesse, Krutik Parikh, Ndagha Mkandawire, Boris Albijanic, Nimal Subasinghe
Abstract:
It is well established that the floatability and selectivity of mineral particles are highly dependent on particle size. Generally, a particle size of 10 microns is considered the critical size below which both flotation selectivity and recovery decline sharply. It is widely accepted that the majority of ultrafine particles, including highly liberated valuable minerals, will be lost in tailings during a conventional flotation process. This is highly undesirable, particularly in the processing of finely disseminated complex and refractory ores, where fine grinding is required in order to liberate the valuable minerals. In addition, the continuing decline in ore grade worldwide necessitates intensive processing of low-grade mineral deposits. Recent advances in comminution allow the economic grinding of particles down to 10 micron sizes to enhance the probability of liberating locked minerals from low-grade ores. Thus, it is timely that the flotation of fine and ultrafine particles is improved in order to reduce the amount of valuable minerals lost as slimes. It is believed that the use of fine bubbles in flotation increases the bubble-particle collision efficiency and hence the flotation performance. Electroflotation, where bubbles are generated by the electrolytic breakdown of water to produce oxygen and hydrogen gases, leads to the formation of extremely finely dispersed gas bubbles with dimensions varying from 5 to 95 microns. The sizes of bubbles generated by this method are significantly smaller than those found in conventional flotation (> 600 microns). In this study, microbubbles generated by electrolysis of water were injected into a bench-top flotation cell to assess the performance of electroflotation in enhancing the flotation of fine and ultrafine pyrite particles of sizes ranging from 5 to 53 microns. The design of the cell and the results from optimization of the process variables, such as current density, pH, percent solids, and particle size, will be presented at this conference.
Keywords: electroflotation, fine bubbles, pyrite, ultrafine particles
Procedia PDF Downloads 335
2729 3D Microscopy, Image Processing, and Analysis of Lymphangiogenesis in Biological Models
Authors: Thomas Louis, Irina Primac, Florent Morfoisse, Tania Durre, Silvia Blacher, Agnes Noel
Abstract:
In vitro and in vivo lymphangiogenesis assays are essential for the identification of potential lymphangiogenic agents and the screening of pharmacological inhibitors. In the present study, we analyse three biological models: in vitro lymphatic endothelial cell spheroids, the in vivo ear sponge assay, and in vivo lymph node colonisation by tumour cells. These assays provide suitable 3D models to test pro- and anti-lymphangiogenic factors or drugs. 3D images were acquired by confocal laser scanning and light sheet fluorescence microscopy. Virtual scan microscopy followed by 3D reconstruction with image-aligning methods was also used to obtain 3D images of whole large sponge and ganglion samples. 3D reconstruction, image segmentation, skeletonisation, and other image processing algorithms are described. Fixed and time-lapse imaging techniques are used to analyse the behaviour of lymphatic endothelial cell spheroids. The study of cell spatial distribution in spheroid models makes it possible to detect interactions between cells and to identify invasion hierarchy and guidance patterns. Global measurements such as volume, length, and density of lymphatic vessels are measured in both in vivo models. Branching density and tortuosity evaluation are also proposed to determine structure complexity. Those properties, combined with vessel spatial distribution, are evaluated in order to determine the extent of lymphangiogenesis. Lymphatic endothelial cell invasion and lymphangiogenesis were evaluated under various experimental conditions. The comparison of these conditions enables the identification of lymphangiogenic agents and a better understanding of their roles in the lymphangiogenesis process. The proposed methodology is validated by its application to the three presented models.
Keywords: 3D image segmentation, 3D image skeletonisation, cell invasion, confocal microscopy, ear sponges, light sheet microscopy, lymph nodes, lymphangiogenesis, spheroids
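To make the global-measurement step concrete, here is a minimal Python sketch of how vessel volume, length, and density could be derived from a 3D fluorescence stack with scikit-image. The Otsu threshold, the small-object filter, and the skeleton-voxel length estimate are simplifying assumptions, not the authors' actual pipeline.

```python
import numpy as np
from skimage import filters, morphology

def vessel_metrics(volume, voxel_size_um=1.0):
    """Global measurements on a 3D stack: vessel volume, a skeleton-based
    estimate of total vessel length, and vessel density."""
    mask = volume > filters.threshold_otsu(volume)        # segment vessels
    mask = morphology.remove_small_objects(mask, min_size=64)
    skeleton = morphology.skeletonize_3d(mask)            # 1-voxel centrelines
    vessel_volume = mask.sum() * voxel_size_um ** 3
    vessel_length = (skeleton > 0).sum() * voxel_size_um  # coarse estimate
    density = mask.sum() / mask.size
    return vessel_volume, vessel_length, density

stack = np.random.rand(32, 128, 128)                      # stand-in for a scan
print(vessel_metrics(stack, voxel_size_um=0.5))
```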
Procedia PDF Downloads 377
2728 Neural Network Mechanisms Underlying the Combination Sensitivity Property in the HVC of Songbirds
Authors: Zeina Merabi, Arij Dao
Abstract:
The temporal order of information processing in the brain is an important code in many acoustic signals, including speech, music, and animal vocalizations. Despite its significance, surprisingly little is known about its underlying cellular mechanisms and network manifestations. In the songbird telencephalic nucleus HVC, a subset of neurons shows temporal combination sensitivity (TCS). These neurons show high temporal specificity, responding differently to distinct patterns of spectral elements and their combinations. HVC neuron types include basal-ganglia-projecting HVCX, forebrain-projecting HVCRA, and interneurons (HVCINT), each exhibiting distinct cellular, electrophysiological, and functional properties. In this work, we develop conductance-based neural network models connecting the different classes of HVC neurons via different wiring scenarios, aiming to explore possible neural mechanisms that orchestrate the combination sensitivity property exhibited by HVCX, as well as to replicate the in vivo firing patterns observed when TCS neurons are presented with various auditory stimuli. The ionic and synaptic currents for each class of neurons represented in our networks are based on pharmacological studies, rendering our networks biologically plausible. We present for the first time several realistic scenarios in which the different types of HVC neurons can interact to produce this behavior. The different networks highlight neural mechanisms that could potentially help to explain some aspects of combination sensitivity, including 1) interplay between the activity of inhibitory interneurons and the post-inhibitory firing of the HVCX neurons enabled by T-type Ca2+ and H currents, 2) temporal summation of synaptic inputs at the TCS site of opposing signals that are time- and frequency-dependent, and 3) reciprocal inhibitory and excitatory loops as a potent mechanism to encode information over many milliseconds. The result is a plausible network model characterizing auditory processing in HVC. Our next step is to test the predictions of the model.
Keywords: combination sensitivity, songbirds, neural networks, spatiotemporal integration
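The post-inhibitory rebound mechanism in point 1) can be illustrated with a deliberately reduced conductance-based model. The Python sketch below integrates a leak current plus a T-type-like low-threshold current whose inactivation variable recovers during a hyperpolarizing pulse, producing a rebound depolarization when the pulse ends. All parameters and gating functions are illustrative assumptions, far simpler than the pharmacologically constrained currents used in the paper.

```python
import numpy as np

# Reduced sketch of post-inhibitory rebound: a leak current plus a
# T-type-like current I_T whose inactivation h de-inactivates while the
# cell is hyperpolarized, then drives a transient depolarization.
dt, T = 0.05, 600.0                       # ms
C, gL, EL = 1.0, 0.1, -65.0               # illustrative membrane parameters
gT, ET, tau_h = 0.4, 120.0, 30.0          # illustrative I_T parameters

def m_inf(v):                             # instantaneous activation
    return 1.0 / (1.0 + np.exp(-(v + 52.0) / 7.4))

def h_inf(v):                             # inactivation recovers at low V
    return 1.0 / (1.0 + np.exp((v + 76.0) / 4.0))

V, h, trace = EL, 0.1, []
for t in np.arange(0.0, T, dt):
    I_inh = -1.5 if 200.0 <= t < 400.0 else 0.0    # inhibitory pulse
    I_T = gT * m_inf(V) ** 2 * h * (ET - V)
    V += dt * (gL * (EL - V) + I_T + I_inh) / C
    h += dt * (h_inf(V) - h) / tau_h
    trace.append(V)
# After the pulse ends (~t = 400 ms), the de-inactivated I_T produces the
# rebound depolarization that could trigger HVC_X firing.
print(f"peak rebound potential: {max(trace[8000:]):.1f} mV")
```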
Procedia PDF Downloads 65
2727 Lean Production to Increase Reproducibility and Work Safety in the Laser Beam Melting Process Chain
Authors: C. Bay, A. Mahr, H. Groneberg, F. Döpper
Abstract:
Additive Manufacturing processes are becoming increasingly established in the industry for the economic production of complex prototypes and functional components. Laser beam melting (LBM), the most frequently used Additive Manufacturing technology for metal parts, has been gaining industrial importance for several years. The LBM process chain – from material storage to machine set-up and component post-processing – requires many manual operations. These steps often depend on the manufactured component and are therefore not standardized; instead, they rely on the experience of the machine operator, e.g., levelling of the build plate and adjusting the first powder layer in the LBM machine. This lack of standardization limits the reproducibility of component quality. When processing metal powders with inhalable and alveolar particle fractions, the machine operator is at high risk due to the high reactivity and the toxic (e.g., carcinogenic) effects of the various metal powders. Faulty execution of an operation or unintentional omission of safety-relevant steps can impair the health of the machine operator. In this paper, all the steps of the LBM process chain are first analysed in terms of their influence on the two aforementioned challenges: reproducibility and work safety. Standardization to avoid errors increases the reproducibility of component quality as well as the adherence to and correct execution of safety-relevant operations. The corresponding lean method 5S will therefore be applied in order to develop approaches, in the form of recommended actions, that standardize the work processes. These approaches will then be evaluated in terms of ease of implementation and their potential for improving reproducibility and work safety. The analysis and evaluation showed that sorting tools and spare parts, as well as standardizing the workflow, are likely to increase reproducibility. Organizing the operational steps and production environment decreases the hazards of material handling and consequently improves work safety.
Keywords: additive manufacturing, lean production, reproducibility, work safety
Procedia PDF Downloads 184
2726 The Role of Nutrition and Food Engineering in Promoting Sustainable Food Systems
Authors: Sara Khan Mohammadi
Abstract:
The world is facing a major challenge: feeding a growing population while ensuring the sustainability of food systems. The United Nations estimates that the global population will reach 9.7 billion by 2050, which means that food production needs to increase by 70% to meet demand. However, this increase in food production should not come at the cost of environmental degradation, loss of biodiversity, and climate change. Therefore, there is a need for sustainable food systems that can provide healthy and nutritious food while minimizing their impact on the environment. Nutrition and food engineering play a crucial role in promoting sustainable food systems. Nutrition is concerned with the study of nutrients in foods, their absorption and metabolism, and their effects on health. Food engineering involves the application of engineering principles to design, develop, and optimize food processing operations. Together, nutrition and food engineering can help to create sustainable food systems by: 1. Developing nutritious foods: nutritionists and food engineers can work together to develop foods that are rich in nutrients such as vitamins, minerals, fiber, and protein. These foods can be designed to meet the nutritional needs of different populations while minimizing waste. 2. Reducing food waste: food waste is a major problem globally, as it contributes to greenhouse gas emissions and wastes resources such as water and land. Nutritionists and food engineers can work together to develop technologies that reduce waste during processing, storage, transportation, and consumption. 3. Improving food safety: unsafe foods can cause illnesses such as diarrhea, cholera, and typhoid fever, which are major public health concerns globally. Nutritionists and food engineers can work together to develop technologies that improve the safety of foods from farm to fork. 4. Enhancing sustainability: sustainable agriculture practices such as conservation agriculture can help reduce soil erosion while improving soil fertility. Nutritionists and food engineers can work together to develop technologies that promote sustainable agriculture practices.
Keywords: sustainable food, developing food, reducing food waste, food safety
Procedia PDF Downloads 86
2725 Investigation of Processing Conditions on Rheological Features of Emulsion Gels and Oleogels Stabilized by Biopolymers
Authors: M. Sarraf, J. E. Moros, M. C. Sánchez
Abstract:
Oleogels are self-standing systems that are able to trap edible liquid oil in a three-dimensional network, helping to reduce fat usage through the crystallization of oleogelators. There are different ways to achieve oleogelation and oil structuring, including direct dispersion, structured biphasic systems, oil sorption, and the indirect (emulsion-template) method. The selection of processing conditions, as well as the composition of the oleogels, is essential to obtain a stable oleogel with characteristics suitable for its purpose. In this sense, polysaccharides are among the ingredients widely used in food products to produce oleogels and emulsions. Basil seed gum (BSG), from the plant with the scientific name Ocimum basilicum, is a new native polysaccharide used in the food industry, with high viscosity and pseudoplastic behavior owing to its high molecular weight. Also, proteins can stabilize oil in water due to the presence of amino and carboxyl moieties that result in surface activity. Whey proteins are widely used in the food industry because they are available, inexpensive ingredients with nutritional and functional characteristics, acting as emulsifiers and gelling agents with thickening and water-binding capacity. In general, the interaction of proteins and polysaccharides has a significant effect on food structures and their stability, like the texture of dairy products, by controlling the interactions in macromolecular systems. Using edible oleogels for oil structuring helps in the targeted delivery of a component trapped in a structural network. Therefore, the development of an efficient oleogel is essential in the food industry. A complete understanding of the important factors, such as the oil-phase ratio, processing conditions, and concentrations of biopolymers, that affect the formation and stability of the emulsion can yield crucial information for the production of a suitable oleogel. In this research, the effects of the oil concentration and of the pressure used in the manufacture of the emulsion prior to obtaining the oleogel have been evaluated through the analysis of droplet size and the rheological properties of the obtained emulsions and oleogels. The results show that emulsions prepared in the high-pressure homogenizer (HPH) at higher pressure values have smaller droplet sizes and higher uniformity in the size distribution curve. On the other hand, in relation to the rheological characteristics of the emulsions and oleogels obtained, the predominantly elastic character of the systems must be noted, as they present storage modulus values higher than the loss modulus, also showing an important plateau zone, typical of structured systems. Likewise, steady-state viscous flow tests on both emulsions and oleogels show, once again, that the pressure used in the homogenizer is an important factor for obtaining emulsions with adequate droplet size and the subsequent oleogel. Thus, various routes for trapping oil inside a biopolymer matrix with adjustable mechanical properties could be applied for the creation of the three-dimensional network for oil absorption and oleogel formation.
Keywords: basil seed gum, particle size, viscoelastic properties, whey protein
Procedia PDF Downloads 66
2724 Material Concepts and Processing Methods for Electrical Insulation
Authors: R. Sekula
Abstract:
Epoxy composites are broadly used as electrical insulation for high-voltage applications, since only such materials can fulfill the particular mechanical, thermal, and dielectric requirements. However, the properties of the final product are strongly dependent on a proper manufacturing process with minimized material failures, such as excessive shrinkage, voids, and cracks. Therefore, the application of proper materials (epoxy, hardener, and filler) and process parameters (mold temperature, filling time, filling velocity, initial temperature of internal parts, gelation time), as well as design and geometric parameters, is essential for the final quality of the produced components. In this paper, an approach for three-dimensional modeling of all molding stages, namely filling, curing, and post-curing, is presented. The reactive molding simulation tool is based on a commercial CFD package and includes dedicated models describing viscosity and reaction kinetics that have been successfully implemented to simulate the reactive nature of the system with its exothermic effect. A dedicated simulation procedure for stress and shrinkage calculations, together with simulation results, is also presented in the paper. The second part of the paper is dedicated to recent developments in formulations of functional composites for electrical insulation applications, focusing on thermally conductive materials. Concepts based on filler modifications for epoxy electrical composites are presented, including the resulting properties. Finally, with tough environmental regulations in mind, in addition to current process and design aspects, an approach for product re-design is presented, focusing on the replacement of the epoxy material with a thermoplastic one. Such a "design-for-recycling" method is one of the new directions associated with the development of new material and processing concepts for electrical products and brings many additional research challenges. To illustrate the presented methodology, one of the successful products is described.
Keywords: curing, epoxy insulation, numerical simulations, recycling
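The reaction-kinetics model at the heart of such a curing simulation is commonly an autocatalytic (Kamal-type) rate law. As a rough, self-contained illustration, the Python sketch below integrates dα/dt = (k₁ + k₂αᵐ)(1 − α)ⁿ with Arrhenius rate constants; every parameter value is invented for the example and does not represent the paper's material data.

```python
import numpy as np

R = 8.314                                  # J/(mol*K)

def arrhenius(A, E, T):
    return A * np.exp(-E / (R * T))

def cure_profile(T_mold=430.0, dt=1.0, t_end=3600.0,
                 A1=1e5, E1=7.0e4, A2=5e6, E2=7.5e4, m=0.7, n=1.5):
    """Integrate the Kamal model: dalpha/dt = (k1 + k2*alpha^m)*(1-alpha)^n."""
    k1, k2 = arrhenius(A1, E1, T_mold), arrhenius(A2, E2, T_mold)
    alpha, history = 0.0, []
    for _ in np.arange(0.0, t_end, dt):
        dadt = (k1 + k2 * alpha ** m) * (1.0 - alpha) ** n
        alpha = min(alpha + dadt * dt, 1.0)   # clamp the degree of cure
        history.append(alpha)
    return np.array(history)

alpha = cure_profile()
print(f"degree of cure after 1 h at 430 K: {alpha[-1]:.2f}")
```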
Procedia PDF Downloads 278
2723 An Overview of the SIAFIM Connected Resources
Authors: Tiberiu Boros, Angela Ionita, Maria Visan
Abstract:
Wildfires are one of the frequent and uncontrollable phenomena that currently affect large areas of the world, where the climate and the geographic and social conditions make it impossible to prevent and control such events. In this paper, we introduce the ground concepts that lie behind the SIAFIM (Satellite Image Analysis for Fire Monitoring) project in order to create a context, and we introduce a set of newly created tools that are external to the project but inherently tied to interventions and complex decision-making based on geospatial information and spatial data infrastructures.
Keywords: wildfire, forest fire, natural language processing, mobile applications, communication, GPS
Procedia PDF Downloads 581
2722 Assessment of Environmental Mercury Contamination from an Old Mercury Processing Plant 'Thor Chemicals' in Cato Ridge, KwaZulu-Natal, South Africa
Authors: Yohana Fessehazion
Abstract:
Mercury is a prominent example of a heavy metal contaminant in the environment, and it has been extensively investigated for its potential health risk to humans and other organisms. In South Africa, massive mercury contamination happened in the 1980s, when an England-based mercury reclamation processing plant relocated to Cato Ridge, KwaZulu-Natal Province, and discharged mercury waste into the Mngceweni River. This discharge resulted in mercury concentrations that exceeded acceptable levels in the Mngceweni River, the Umgeni River, and the hair of nearby villagers. This environmental issue raised the alarm, and over the years, several environmental assessments reported the dire environmental crisis resulting from Thor Chemicals (now known as Metallica Chemicals) and urged the immediate removal of the roughly 3,000 tons of mercury waste stored in the factory storage facility for over two decades. The recent theft of some containers of the toxic substance from the Thor Chemicals warehouse, and the subsequent fire that ravaged the facility, further put the factory in the spotlight, escalating the urgency of removing the deadly mercury waste left behind. This project aims to investigate the mercury contamination leaking from the old Thor Chemicals mercury processing plant. The focus will be on sediments, water, terrestrial plants, and aquatic weeds, such as the prominent water hyacinth, in the nearby water systems of the Mngceweni River, the Umgeni River, and Inanda Dam, as bio-indicators and phytoremediators of mercury pollution. Samples will be collected in spring, around October, when conditions are favourable for the microbial activity that methylates mercury incorporated in sediments, and during the blooming season for some aquatic weeds, particularly water hyacinth. Samples of soil, sediment, water, terrestrial plants, and aquatic weeds will be collected per sample site from the point of source (Thor Chemicals), the Mngceweni River, the Umgeni River, and the Inanda Dam. One-way analysis of variance (ANOVA) tests will be conducted to determine any significant differences in Hg concentration among the sampling sites, followed by a Least Significant Difference post hoc test to determine whether mercury contamination varies with the gradient distance from the source point of pollution. Flow injection atomic spectrometry (FIAS) analysis will also be used to compare mercury sequestration between the different plant tissues (roots and stems). Principal component analysis is also envisaged to determine the relationship between the source of mercury pollution and each of the sampling points (the Umgeni and Mngceweni Rivers and the Inanda Dam). All Hg values will be expressed in µg/L or µg/g in order to compare the results with previous studies and regulatory standards. Sediments are expected to have relatively higher levels of Hg than the soils, and aquatic macrophytes, particularly water hyacinth weeds, are expected to accumulate a higher concentration of mercury than terrestrial plants and crops.
Keywords: mercury, phytoremediation, Thor chemicals, water hyacinth
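The planned statistical workflow (one-way ANOVA followed by Fisher's LSD) can be prototyped in a few lines of Python. The sketch below runs on simulated Hg concentrations; the site means and sample sizes are purely illustrative stand-ins, since the study's measurements are not yet available.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sites = {                                  # simulated Hg levels, ug/L
    "source": rng.normal(12.0, 2.0, 8),    # Thor Chemicals outfall
    "mngceweni": rng.normal(8.0, 2.0, 8),
    "umgeni": rng.normal(5.0, 2.0, 8),
    "inanda": rng.normal(2.5, 2.0, 8),
}
f_stat, p = stats.f_oneway(*sites.values())
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p:.4f}")

# Fisher's LSD: unadjusted pairwise t-tests, run only if the ANOVA rejects
names = list(sites)
if p < 0.05:
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            _, pp = stats.ttest_ind(sites[names[i]], sites[names[j]])
            print(f"{names[i]} vs {names[j]}: p = {pp:.4f}")
```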
Procedia PDF Downloads 223
2721 Algorithm for Improved Tree Counting and Detection through Adaptive Machine Learning Approach with the Integration of Watershed Transformation and Local Maxima Analysis
Authors: Jigg Pelayo, Ricardo Villar
Abstract:
The Philippines has long been considered a valuable producer of high-value crops globally. The country's employment and economy have been dependent on agriculture, thus increasing the demand for efficient agricultural mechanisms. Remote sensing and geographic information technology have proven to effectively provide applications for precision agriculture through image-processing techniques, aided by the development of aerial scanning technology in the country. Accurate information concerning the spatial correlation within the field is very important, especially for the precision farming of high-value crops. The availability of height information and high-spatial-resolution images obtained from aerial scanning, together with the development of new image analysis methods, is exerting a relevant influence on precision agriculture techniques and applications. In this study, an algorithm was developed and implemented to detect and count high-value crops simultaneously through adaptive scaling of a support vector machine (SVM) within an object-oriented approach combining watershed transformation and a local maxima filter to enhance tree counting and detection. The methodology is compared to cutting-edge template matching procedures to demonstrate its effectiveness on a demanding tree counting, recognition, and delineation problem. Since common data and image processing techniques are utilized, it can be easily implemented in production processes to cover large agricultural areas. The algorithm was tested on high-value crops such as palm, mango, and coconut located in Misamis Oriental, Philippines, showing good performance, in particular for young adult and adult trees, significantly above 90%. The results can support inventories or database updating, allowing for the reduction of fieldwork and manual interpretation tasks.
Keywords: high value crop, LiDAR, OBIA, precision agriculture
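The watershed-plus-local-maxima core of the pipeline is straightforward to sketch with scikit-image. The snippet below applies marker-controlled watershed to a synthetic canopy height model; the height threshold, minimum peak distance, and the use of a smoothed random surface are assumptions for illustration, not the paper's configuration.

```python
import numpy as np
from scipy import ndimage
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def count_trees(chm, min_height=2.0, min_distance=5):
    """Local maxima mark candidate treetops; marker-controlled watershed
    on the inverted canopy height model (CHM) delineates the crowns."""
    mask = chm > min_height                        # drop ground / low cover
    tops = peak_local_max(chm, min_distance=min_distance,
                          labels=mask.astype(int))
    markers = np.zeros(chm.shape, dtype=int)
    markers[tuple(tops.T)] = np.arange(1, len(tops) + 1)
    crowns = watershed(-chm, markers, mask=mask)   # flood inverted CHM
    return len(tops), crowns

# smoothed random surface as a stand-in for a LiDAR-derived CHM
chm = ndimage.gaussian_filter(np.random.rand(200, 200) * 10.0, sigma=4)
n_trees, crowns = count_trees(chm)
print(f"detected {n_trees} candidate trees")
```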
Procedia PDF Downloads 402
2720 The Design and Development of Online Infertility Prevention Education in the Frame of Mayer's Multimedia Learning Theory
Authors: B. Baran, S. N. Kaptanoglu, M. Ocal, Y. Kagnici, E. Esen, E. Siyez, D. M. Siyez
Abstract:
Infertility is the inability of couples to have children despite one year of unprotected sexual activity. Infertility can be considered an important problem affecting not only the sexual life but also the social and psychological condition of couples. Learning about preventable factors related to infertility during the university years plays an important role in preventing possible infertility at older ages. The internet facilitates access to information and provides the opportunity to reach a broad audience across diverse learning and educational environments. Moreover, the internet has become a basic resource for 21st-century learners. Providing information about infertility over the internet will enable more people to be reached in a short time. When studies conducted abroad about infertility are examined, interactive websites and online education programs come to the fore. In Turkey, there is no comprehensive online education program for university students, and existing efforts seem aimed more at advertising doctors or hospitals. In this study, the aim was to design and develop online infertility prevention education for university students. Mayer's Multimedia Learning Theory provided the framework for the online learning environment. The results of a needs analysis, collected from university students in Turkey who were selected by sampling to represent the audience for online learning, contributed to the design phase. In this study, an online infertility prevention education environment designed as a four-week course was developed, with its theoretical basis and needs analysis results explained. As a result, in the development of the online environment, different kinds of visual aids that enhance teaching were used in accordance with Mayer's principles of extraneous processing (coherence, signaling, spatial contiguity, temporal contiguity, redundancy, and expectation principles), essential processing (segmenting, pre-training, and modality principles), and generative processing (multimedia, personalization, and voice principles). For example, the important points in the description of the reproductive systems were emphasized by visuals in order to draw learners' attention, and the presentation of the information was also supported by the human voice. In addition, because of the limited knowledge of university students on the subject, the female and male reproductive systems were taught before the preventable factors related to infertility. Furthermore, a 3D video and an augmented reality application were developed in order to visualize the female and male reproductive systems. In conclusion, this study aims to develop an interactive online infertility prevention education in which university students can easily access reliable information and evaluate their own level of knowledge about the subject. It is believed that the study will also guide researchers who want to develop online education in this area, as it contains the design-stage decisions of an interactive online infertility prevention education for university students.
Keywords: infertility, multimedia learning theory, online education, reproductive health
Procedia PDF Downloads 170
2719 Emotion Recognition in Video and Images in the Wild
Authors: Faizan Tariq, Moayid Ali Zaidi
Abstract:
Facial emotion recognition algorithms are expanding rapidly nowadays. People are using different algorithms in different combinations to generate the best results. There are six basic emotions that are studied in this area. The authors tried to recognize facial expressions using object detection algorithms instead of traditional algorithms. Two object detection algorithms were chosen: Faster R-CNN and YOLO. For pre-processing, we used image rotation and batch normalization. The dataset chosen for the experiments is Static Facial Expressions in the Wild (SFEW). Our approach worked well, but there is still a lot of room to improve it, which will be a future direction.
Keywords: face recognition, emotion recognition, deep learning, CNN
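As a rough sketch of how a detector can be repurposed for this task, the following Python/torchvision snippet swaps the box-classification head of a pre-trained Faster R-CNN so that detected faces are labeled with one of the six basic emotions, and sets up the rotation used in pre-processing. The class count, rotation angle, and overall recipe are assumptions for illustration, not the authors' exact configuration.

```python
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision import transforms

NUM_CLASSES = 7  # six basic emotions + background (assumed labeling)

# Pre-trained detector with its box classifier swapped for emotion classes
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

# Pre-processing mentioned in the abstract: image rotation; pixel
# normalization is applied internally by the detector's transform.
train_tf = transforms.Compose([
    transforms.RandomRotation(degrees=15),   # illustrative angle
    transforms.ToTensor(),
])
# Training would then proceed with (image, {boxes, labels}) targets on SFEW.
```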
Procedia PDF Downloads 187
2718 Digital Revolution a Veritable Infrastructure for Technological Development
Authors: Osakwe Jude Odiakaosa
Abstract:
Today's digital society is characterized by e-education or e-learning, e-commerce, and so on. All of these have been propelled by the digital revolution. Digital technologies such as computer technology, the Global Positioning System (GPS), and Geographic Information Systems (GIS) have had a tremendous impact on the field of technology. This development has positively affected the scope, methods, and speed of data acquisition, data management, and the rate of delivery of data-processing results (maps and other map products). This paper tries to address the impact of the revolution brought about by digital technology.
Keywords: digital revolution, internet, technology, data management
Procedia PDF Downloads 449
2717 Distributive School Leadership in Croatian Primary Schools
Authors: Iva Buchberger, Vesna Kovač
Abstract:
Global education policy trends and recommendations underline the importance of (distributive) school leadership as a key factor in school effectiveness. In this context, the broader aim of this research (supported by the Croatian Science Foundation) is to identify school leadership characteristics in Croatian schools and to examine the correlation between school leadership and school effectiveness. The aim of the proposed conference paper is to focus on school leadership characteristics, which are additionally explained through the school leadership facilitators that contribute to (distributive) school leadership development. The aforementioned school leadership characteristics include the following dimensions: (a) participation in the process of making different types of decisions, (b) influence in the decision-making process, and (c) social interactions between different stakeholders in the decision-making process in schools. Further, the school leadership facilitators are categorized as follows: (a) the principal's activities (such as providing support to different stakeholders and developing mutual trust among them), (b) stakeholders' characteristics (such as developed stakeholder interest and competence to participate in the decision-making process), and (c) organizational and material resources (such as school material conditions and the necessary information and time for making decisions). The data were collected with a constructed and validated questionnaire examining school leadership characteristics and facilitators from the teachers' perspective. The main population in this study consists of all primary schools in Croatia, while the sample comprises 100 primary schools selected by random sampling. Furthermore, the sample of teachers was selected by an additional procedure taking into consideration the independent variables of sex, work experience, etc. Data processing was performed with standard statistical methods of descriptive and inferential statistics, using the statistical program IBM SPSS 20.0. The results of this study show that there is a positive correlation between school leadership characteristics and school leadership facilitators. Specifically, it is noteworthy that all the dimensions of school leadership characteristics are positively correlated with the categories of school leadership facilitators. These results are indicative for education policy makers, who should ensure a positive and supportive environment for school leadership development, including the development of school leadership characteristics and school leadership facilitators.
Keywords: distributive school leadership, school effectiveness, school leadership characteristics, school leadership facilitators
Procedia PDF Downloads 248
2716 Leveraging Natural Language Processing for Legal Artificial Intelligence: A Longformer Approach for Taiwanese Legal Cases
Abstract:
Legal artificial intelligence (LegalAI) has seen increasing application within legal systems, propelled by advancements in natural language processing (NLP). Compared with general documents, legal case documents are typically long text sequences with intrinsic logical structures. Most existing language models have difficulty understanding the long-distance dependencies between different structures. Another unique challenge is that, while the Judiciary of Taiwan has released legal judgments from various levels of courts over the years, there remains a significant obstacle in the lack of labeled datasets. This deficiency makes it difficult to train models with strong generalization capabilities, as well as to accurately evaluate model performance. To date, models in Taiwan have yet to be specifically trained on judgment data. Given these challenges, this research proposes a Longformer-based pre-trained language model explicitly devised for retrieving similar judgments in Taiwanese legal documents. The model is trained on a self-constructed dataset, which this research has independently labeled to measure judgment similarities, thereby addressing the void left by the lack of an existing labeled dataset for Taiwanese judgments. This research adopts strategies such as early stopping and gradient clipping to prevent overfitting and manage gradient explosion, respectively, thereby enhancing the model's performance. The model is evaluated using both the dataset and the Average Entropy of Offense-charged Clustering (AEOC) metric, which utilizes the notion of similar case scenarios within the same type of legal cases. Our experimental results illustrate the model's significant advancements in handling similarity comparisons within extensive legal judgments. By enabling more efficient retrieval and analysis of legal case documents, our model holds the potential to facilitate legal research, aid legal decision-making, and contribute to the further development of LegalAI in Taiwan.
Keywords: legal artificial intelligence, computation and language, language model, Taiwanese legal cases
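For readers unfamiliar with Longformer-based retrieval, here is a minimal Python sketch of similar-judgment scoring with Hugging Face Transformers. It uses the public allenai/longformer-base-4096 checkpoint as a stand-in, since the authors' Taiwanese legal model and labeled dataset are not public, and the embedding convention (the [CLS] token with global attention) is a common choice rather than necessarily the paper's.

```python
import torch
from transformers import LongformerModel, LongformerTokenizer

name = "allenai/longformer-base-4096"       # public stand-in checkpoint
tok = LongformerTokenizer.from_pretrained(name)
model = LongformerModel.from_pretrained(name).eval()

def embed(text):
    enc = tok(text, truncation=True, max_length=4096, return_tensors="pt")
    enc["global_attention_mask"] = torch.zeros_like(enc["input_ids"])
    enc["global_attention_mask"][:, 0] = 1         # global attention on [CLS]
    with torch.no_grad():
        out = model(**enc).last_hidden_state[:, 0]  # [CLS] embedding
    return torch.nn.functional.normalize(out, dim=-1)

sim = (embed("judgment text A ...") @ embed("judgment text B ...").T).item()
print(f"cosine similarity: {sim:.3f}")
```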
Procedia PDF Downloads 72
2715 Effects of Partial Sleep Deprivation on Prefrontal Cognitive Functions in Adolescents
Authors: Nurcihan Kiris
Abstract:
Restricted sleep is common in young adults and adolescents, yet the few objective studies of the effect of sleep deprivation on cognitive performance have not clarified the picture. In particular, the effect of sleep deprivation on cognitive functions associated with the frontal lobe, such as attention, executive functions, and working memory, is not well known. The aim of this study is to investigate experimentally the effect of partial sleep deprivation in adolescents on cognitive tasks of the frontal lobe, including working memory, strategic thinking, simple attention, continuous attention, executive functions, and cognitive flexibility. Subjects were recruited from volunteer students of Cukurova University. Eighteen adolescents underwent four consecutive nights of monitored sleep restriction (6–6.5 hr/night) and four nights of sleep extension (10–10.5 hr/night), in counterbalanced order and separated by a washout period. Following each sleep period, cognitive performance was assessed, at a fixed morning time, using a computerized neuropsychological battery based on frontal lobe function tasks, a timed test providing both accuracy and reaction time outcome measures. Of the cognitive tasks, only spatial working memory performance was found to be statistically lower in the restricted sleep condition than in the extended sleep condition. On the other hand, there was no significant difference in the performance of cognitive tasks evaluating simple attention, continuous attention, executive functions, and cognitive flexibility. It is thought that the spatial working memory and strategic thinking skills of adolescents, in particular, may be susceptible to sleep deprivation. On the other hand, adolescents are predicted to perform optimally under ideal sleep conditions, especially in circumstances requiring the short-term storage of visual information, the processing of stored information, and strategic thinking. The findings of this study may also point to possible negative functional effects of partial sleep deprivation on the processing of academic, social, and emotional inputs in adolescents. Acknowledgment: This research was supported by the Cukurova University Scientific Research Projects Unit.
Keywords: attention, cognitive functions, sleep deprivation, working memory
Procedia PDF Downloads 155
2714 Yacht DB Construction Based on Five Essentials of Sailing
Authors: Jae-Neung Lee, Myung-Won Lee, Jung-Su Han, Keun-Chang Kwak
Abstract:
This paper establishes a database (DB) based on the five essentials of sailing in a real yachting environment. It captures the yacht's condition (tilt, speed, and course), the surrounding circumstances (wind direction and speed), and the user's motion. A GoPro camera with image processing was used to recognize the user's motion, and a tilt sensor was employed to observe the yacht's balance. In addition, GPS for the course, a wind speed and direction sensor, and a marked suit were employed.
Keywords: DB construction, yacht, five essentials of sailing, marker, GPS
Procedia PDF Downloads 461
2713 Vascular Targeted Photodynamic Therapy Monitored by Real-Time Laser Speckle Imaging
Authors: Ruth Goldschmidt, Vyacheslav Kalchenko, Lilah Agemy, Rachel Elmoalem, Avigdor Scherz
Abstract:
Vascular targeted photodynamic therapy (VTP) is a new modality for selective cancer treatment that leads to complete tumor ablation. A photosensitizer, a bacteriochlorophyll derivative in our case, is first administered to the patient, followed by illumination of the tumor area with a near-IR laser for its photoactivation. The photoactivated drug releases reactive oxygen species (ROS) into the circulation, which react with blood cells and the endothelium, leading to the occlusion of the blood vasculature. If the blood vessels are only partially closed, the tumor may recover, and cancer cells could survive. On the other hand, excessive treatment may lead to toxicity in nearby healthy tissues. Simultaneous VTP monitoring and image processing independent of the photoexcitation laser has not yet been reported, to our knowledge. Here we present a method for blood flow monitoring using real-time laser speckle imaging (RTLSI) in the tumor during VTP. We have synthesized over the years a library of bacteriochlorophyll derivatives, among them WST11 and STL-6014. Both are water-soluble derivatives that are retained in the blood vasculature through their partial binding to HSA. WST11 has been approved in Mexico for VTP treatment of prostate cancer at a certain drug dose and time/intensity of illumination. Application to other bacteriochlorophyll derivatives or other cancers may require different treatment parameters (such as light/drug administration). VTP parameters for STL-6014 are still under study. This new derivative mainly differs from WST11 in its lack of the central palladium and its conjugation to an Arg-Gly-Asp (RGD) sequence. RGD is a tumor-specific ligand that is used for targeting the necrotic tumor domains through its affinity to αVβ3 integrin receptors. This enables the study of cell-targeted VTP. We developed a special RTLSI module, based on the LabVIEW software environment, for data processing. The new module enables the acquisition of raw laser speckle images and the calculation of the temporal statistics of time-integrated speckles in real time, without additional off-line processing. Using RTLSI, we could monitor the tumor's blood flow following VTP in a CT26 colon carcinoma ear model. VTP with WST11 induced an immediate slowdown of the blood flow within the tumor and a complete final flow arrest after some sporadic reperfusions. If the irradiation continued further, the blood flow stopped also in the blood vessels of the surrounding healthy tissue. This emphasizes the significance of light dose control. Using our RTLSI system, we could prevent additional healthy tissue damage by controlling the illumination time and restricting blood flow arrest to within the tumor only. In addition, we found that VTP with STL-6014 was most effective when photoactivation was conducted 4 h post-injection, in terms of tumor ablation success in vivo and blood vessel flow arrest. In conclusion, RTLSI application should allow the optimization of VTP efficacy vs. toxicity in both the preclinical and clinical arenas.
Keywords: blood vessel occlusion, cancer treatment, photodynamic therapy, real time imaging
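The core quantity behind RTLSI-style flow monitoring is the speckle contrast computed over a stack of time-integrated frames. As a generic illustration, not the authors' LabVIEW implementation, the Python sketch below computes the per-pixel temporal contrast K = σ/⟨I⟩ on simulated frames; low contrast indicates fast flow, and contrast rising toward 1 indicates flow arrest.

```python
import numpy as np

def temporal_speckle_contrast(stack):
    """Per-pixel temporal contrast K = sigma_t / mean_t over a short
    window of raw frames. Fast flow blurs the speckle (low K); K rising
    toward 1 signals flow arrest."""
    mean = stack.mean(axis=0)
    std = stack.std(axis=0)
    return std / np.clip(mean, 1e-6, None)

# stand-in for a window of raw camera frames: (frames, height, width)
frames = np.random.poisson(lam=50, size=(25, 256, 256)).astype(float)
K = temporal_speckle_contrast(frames)
print(f"mean contrast: {K.mean():.3f}")
```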
Procedia PDF Downloads 223
2712 Investigating the Editing's Effect of Advertising Photos on the Virtual Purchase Decision Based on the Quantitative Electroencephalogram (EEG) Parameters
Authors: Parya Tabei, Maryam Habibifar
Abstract:
Decision-making is an important cognitive function that can be defined as the process of choosing an option among available options to achieve a specific goal. Consumer 'need' is the main reason for purchasing decisions. Human decision-making while buying products online is subject to various factors, one of which is the quality and effect of advertising photos. Advertising photo editing can have a significant impact on people's virtual purchase decisions. This technique helps improve the quality and overall appearance of photos by adjusting various aspects such as brightness, contrast, colors, cropping, resizing, and adding filters. This study examines the effect of editing advertising photos on the virtual purchase decision using EEG data, investigating how edited images influence customers' decision-making. A group of 30 participants was asked to react to 24 edited and unedited images while their EEG was recorded. Analysis of the EEG data revealed increased alpha wave activity in the occipital regions (O1, O2) for both edited and unedited images, which is related to visual processing and attention. Additionally, there was an increase in beta wave activity in the frontal regions (FP1, FP2, F4, F8) when participants viewed edited images, suggesting involvement of cognitive processes such as decision-making and evaluating advertising content. Gamma wave activity also increased in various regions, especially the frontal and parietal regions, which are associated with higher cognitive functions such as attention, memory, and perception, when viewing the edited images. While the visual processing reflected by alpha waves remained consistent across different visual conditions, editing advertising photos appeared to boost neural activity in the frontal and parietal regions associated with decision-making processes. These findings suggest that photo editing could potentially influence consumer perceptions during virtual shopping experiences by modulating brain activity related to product assessment and purchase decisions.
Keywords: virtual purchase decision, advertising photo, EEG parameters, decision making
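Band-specific activity of the kind reported above is commonly quantified as spectral power in the alpha, beta, and gamma ranges. The Python sketch below shows one standard way to compute it with Welch's method; the sampling rate, band edges, and synthetic data are assumptions for illustration, not details taken from the study.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}  # Hz

def band_powers(eeg, fs=256):
    """Absolute band power per channel from Welch's PSD.
    eeg: array of shape (channels, samples)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        idx = (freqs >= lo) & (freqs < hi)
        powers[name] = np.trapz(psd[:, idx], freqs[idx], axis=1)
    return powers

eeg = np.random.randn(4, 256 * 30)        # 4 channels, 30 s synthetic signal
print({band: p.round(4) for band, p in band_powers(eeg).items()})
```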
Procedia PDF Downloads 49
2711 Mg AZ31B Alloy Processed through ECASD
Authors: P. Fernández-Morales, D. Peláez, C. Isaza, J. M. Meza, E. Mendoza
Abstract:
Mg AZ31B alloy sheets were processed by equal-channel angular sheet drawing (ECASD), following routes A and C at room temperature and varying the processing speed. SEM was used to analyze the microstructure: the grain size was refined, and the presence of twins was observed. Vickers microhardness and tensile testing were carried out to evaluate the mechanical properties, showing, in general, a remarkable increase after the first pass and slight increases during subsequent passes, and that route C produces a more uniform distribution of properties through the thickness of the samples.
Keywords: ECASD, Mg alloy, mechanical properties, microstructure
Procedia PDF Downloads 363
2710 Mechanical Properties of Poly(Propylene)-Based Graphene Nanocomposites
Authors: Luiza Melo De Lima, Tito Trindade, Jose M. Oliveira
Abstract:
The development of thermoplastic-based graphene nanocomposites has been of great interest not only to the scientific community but also to different industrial sectors. Due to the possible improvement of performance and reduction of weight, thermoplastic nanocomposites show great promise as a new class of materials. These nanocomposites are of relevance for the automotive industry, namely because the CO2 emission limits imposed by European Commission (EC) regulations can be met without compromising a car's performance, by reducing its weight. Thermoplastic polymers have some advantages over thermosetting polymers, such as higher productivity, lower density, and recyclability. In the automotive industry, for example, poly(propylene) (PP) is a common thermoplastic polymer, representing more than half of the polymeric raw material used in automotive parts. Graphene-based materials (GBM) are potential nanofillers that can improve the properties of polymer matrices at very low loadings. In comparison to other composites, such as fiber-based composites, the weight reduction can positively affect their processing and future applications. However, the properties and performance of GBM/polymer nanocomposites depend on the type of GBM and polymer matrix, the degree of dispersion, and especially the type of interactions between the fillers and the polymer matrix. In order to take advantage of the superior mechanical strength of GBM, strong interfacial adhesion between the GBM and the polymer matrix is required for efficient stress transfer from GBM to the polymer. Thus, chemical compatibilizers and physicochemical modifications have been reported as important tools during the processing of these nanocomposites. In this study, PP-based nanocomposites were obtained by a simple melt blending technique, using a Brabender-type mixer. Graphene nanoplatelets (GnPs) were applied as structural reinforcement. Two compatibilizers were used to improve the interaction between the PP matrix and the GnPs: PP grafted with maleic anhydride (PPgMA) and PPgMA modified with a tertiary amine alcohol (PPgDM). The samples for tensile and Charpy impact tests were obtained by injection molding. The results suggested that the presence of GnPs can increase the mechanical strength of the polymer. However, it was verified that the presence of GnPs can reduce the impact resistance, making the nanocomposites more fragile than neat PP. The incorporation of the compatibilizers increases the impact resistance, suggesting that they enhance the adhesion between PP and GnPs. Compared to neat PP, the increase in Young's modulus of the non-compatibilized nanocomposite demonstrated that GnP incorporation can improve the stiffness of the polymer. This trend can be related to the several physical crosslinking points between the PP matrix and the GnPs. Furthermore, the decrease in strain at yield of PP/GnPs, together with the enhancement of Young's modulus, confirms that GnP incorporation leads to an increase in stiffness but a decrease in toughness. Moreover, the results demonstrated that the incorporation of compatibilizers did not affect the Young's modulus and strain at yield results compared to the non-compatibilized nanocomposite. The incorporation of these compatibilizers improved the nanocomposites' mechanical properties compared both to those of the non-compatibilized nanocomposite and to a PP sample used as reference.
Keywords: graphene nanoplatelets, mechanical properties, melt blending processing, poly(propylene)-based nanocomposites
Procedia PDF Downloads 187
2709 Password Cracking on Graphics Processing Unit Based Systems
Authors: N. Gopalakrishna Kini, Ranjana Paleppady, Akshata K. Naik
Abstract:
Password authentication is one of the most widely used methods to authenticate legitimate users of computers and to defend against attackers. There are many different ways to authenticate the users of a system, and many password cracking methods have also been developed. This paper proposes how password cracking can best be performed on a CPU-GPGPU-based system. The main objective of this work is to show how quickly a password can be cracked, given some knowledge of computer security and password cracking, if sufficient security is not incorporated into the system.
Keywords: GPGPU, password cracking, secret key, user authentication
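To make the workload concrete, here is a minimal CPU-only Python sketch of exhaustive hash cracking. The inner loop is the embarrassingly parallel kernel that a GPGPU implementation would distribute across thousands of threads; MD5, the lowercase charset, and the length cap are illustrative choices, not the paper's setup.

```python
import hashlib
import itertools
import string

def brute_force_md5(target_hash, charset=string.ascii_lowercase, max_len=4):
    """Exhaustive search over all candidate strings up to max_len.
    Each candidate check is independent, which is exactly what makes
    the problem map well onto thousands of GPU threads."""
    for length in range(1, max_len + 1):
        for chars in itertools.product(charset, repeat=length):
            word = "".join(chars)
            if hashlib.md5(word.encode()).hexdigest() == target_hash:
                return word
    return None

target = hashlib.md5(b"abc").hexdigest()
print(brute_force_md5(target))   # -> 'abc'
```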
Procedia PDF Downloads 290
2708 A Novel Machine Learning Approach to Aid Agrammatism in Non-fluent Aphasia
Authors: Rohan Bhasin
Abstract:
Agrammatism in non-fluent aphasia can be defined as a language disorder wherein a patient can only use content words (nouns, verbs, and adjectives) for communication; their speech is devoid of function word types like conjunctions and articles, producing speech with extremely rudimentary grammar. Past approaches involve some form of speech therapy, with conversation analysis used to analyse pre-therapy speech patterns and the qualitative changes in conversational behaviour after therapy. We describe a novel method to generate function words (prepositions, articles, etc.) around content words (nouns, verbs, and adjectives) using a combination of natural language processing and deep learning algorithms. This approach can be applied to assist communication. The approach the paper investigates is a sequence-to-sequence (seq2seq) model, such as an LSTM, which takes in a sequence of inputs and produces an output sequence. This approach needs a significant amount of training data, with each training example consisting of a pair such as (content words, complete sentence). We generate such data by starting with complete sentences from a text source and removing the function words to obtain just the content words. However, this approach would require a lot of training data to produce coherent output. The assumption of this approach is that the content words in the input are preserved, i.e., they are not altered after the functional grammar is slotted in. This is a potential limitation in cases of severe agrammatism, where such an order might not be inherently correct. This approach can be applied to assist communication in cases of mild agrammatism in non-fluent aphasia. Thus, by generating these function words around the content words, we can provide meaningful sentence options to the patient for articulate conversations. Our project thereby translates the use case of generating sentences from content-specific words into an assistive technology for non-fluent aphasia patients.
Keywords: aphasia, expressive aphasia, assistive algorithms, neurology, machine learning, natural language processing, language disorder, behaviour disorder, sequence to sequence, LSTM
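The data-generation step described above, stripping function words from complete sentences to build (content words, sentence) training pairs, can be sketched in a few lines of Python. The function-word list here is a small hand-picked set for illustration; a real pipeline would use a full stop-word list or POS tags.

```python
import re

# Small hand-picked function-word set; a real pipeline would use a full
# stop-word list or POS tagging to decide what to strip.
FUNCTION_WORDS = {
    "a", "an", "the", "and", "or", "but", "of", "in", "on", "at",
    "to", "for", "with", "is", "are", "was", "were", "be", "been",
}

def make_pair(sentence):
    """Return (content-word input, complete-sentence target)."""
    tokens = re.findall(r"[a-z']+", sentence.lower())
    content = [t for t in tokens if t not in FUNCTION_WORDS]
    return " ".join(content), sentence

src, tgt = make_pair("The boy is playing in the garden.")
print(src)   # 'boy playing garden'
print(tgt)   # the sentence the seq2seq model learns to reconstruct
```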
Procedia PDF Downloads 164
2707 Detection of Phoneme [S] Mispronunciation for Sigmatism Diagnosis in Adults
Authors: Michal Krecichwost, Zauzanna Miodonska, Pawel Badura
Abstract:
The diagnosis of sigmatism is mostly based on the observation of the articulatory organs. It is, however, not always possible to precisely observe the vocal apparatus, in particular the oral cavity of the patient. Speech processing can help objectify the therapy and simplify the verification of its progress. In the described study, a methodology for the classification of the incorrectly pronounced phoneme [s] is proposed. The recordings come from adults. They were registered with a speech recorder at a sampling rate of 44.1 kHz and a resolution of 16 bits. A database of pathological and normative speech was collected for the study, including reference assessments provided by speech therapy experts. Ten adult subjects were asked to simulate a certain type of sigmatism under the supervision of a speech therapy expert. In the recordings, the analyzed phone [s] was surrounded by vowels, viz: ASA, ESE, ISI, SPA, USU, YSY. Thirteen MFCCs (mel-frequency cepstral coefficients) and the RMS (root mean square) value are calculated within each frame that is part of the analyzed phoneme. Additionally, 3 fricative formants, along with their corresponding amplitudes, are determined for the entire segment. In order to aggregate the information within the segment, the average value of each MFCC coefficient is calculated. All features of other types are aggregated by means of their 75th percentile. The proposed method of feature aggregation reduces the size of the feature vector used in the classification. A binary SVM (support vector machine) classifier is employed at the phoneme recognition stage. The first group consists of pathological phones, the other of normative ones. The proposed feature vector yields classification sensitivity and specificity above the 90% level for individual logatomes. The use of fricative formant-based information improves the MFCC-only classification results by an average of 5 percentage points. The study shows that the use of specific parameters for the selected phones improves the efficiency of pathology detection compared to traditional methods of speech signal parameterization.
Keywords: computer-aided pronunciation evaluation, sibilants, sigmatism diagnosis, speech processing
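A rough Python rendering of the feature-extraction and classification pipeline is shown below. It reproduces the aggregation idea, mean MFCCs per segment plus a 75th-percentile statistic, using librosa and scikit-learn on synthetic stand-in signals; it omits the fricative formant features and uses invented labels, so it is a shape-of-the-pipeline sketch only.

```python
import numpy as np
import librosa
from sklearn.svm import SVC

def segment_features(y, sr=44100):
    """Aggregate frame-level features over one [s] segment: the mean of
    13 MFCCs plus the 75th percentile of the frame RMS values."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)     # (13, frames)
    rms = librosa.feature.rms(y=y)[0]
    return np.concatenate([mfcc.mean(axis=1), [np.percentile(rms, 75)]])

# synthetic stand-ins: one 0.5 s "segment" per recording, invented labels
rng = np.random.default_rng(1)
segments = [rng.standard_normal(22050) for _ in range(20)]
X = np.array([segment_features(s) for s in segments])
y = np.array([0] * 10 + [1] * 10)          # 0 = normative, 1 = pathological
clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict(X[:3]))
```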
Procedia PDF Downloads 283
2706 From Industry 4.0 to Agriculture 4.0: A Framework to Manage Product Data in Agri-Food Supply Chain for Voluntary Traceability
Authors: Angelo Corallo, Maria Elena Latino, Marta Menegoli
Abstract:
The agri-food value chain involves various stakeholders with different roles. All of them abide by national and international rules and leverage marketing strategies to advance their products. Food products and the related processing phases carry with them a large amount of data that is often not used to inform the final customer. Some data, if fittingly identified and used, can enhance the single company and/or the whole supply chain, creating a match between marketing techniques and voluntary traceability strategies. Moreover, of late, buying models have changed: the customer is attentive to wellbeing and food quality. Food citizenship and food democracy were born, leveraging transparency, sustainability, and food information needs. The Internet of Things (IoT) and analytics, some of the innovative technologies of Industry 4.0, have a significant impact on the market and will act as a main thrust towards a genuine '4.0 change' for agriculture. However, realizing a traceability system is not simple because of the complexity of the agri-food supply chain, the many actors involved, the different business models, the environmental variations impacting products and/or processes, and extraordinary climate changes. In order to support companies engaged in a traceability path, a Framework to Manage Product Data in the Agri-Food Supply Chain for Voluntary Traceability was conceived, starting from business model analysis and the related business processes. Studying each process task and leveraging modeling techniques leads to identifying the information held by the different actors along the agri-food supply chain. IoT technologies for data collection and analytics techniques for data processing supply information useful for increasing intra-company efficiency and competitiveness in the market. All the recovered information can be exposed through IT solutions and mobile applications, making it accessible to the company, the entire supply chain, and the consumer, with a view to guaranteeing transparency and quality.
Keywords: agriculture 4.0, agri-food supply chain, industry 4.0, voluntary traceability
Procedia PDF Downloads 147
2705 Technical and Practical Aspects of Sizing an Autonomous PV System
Authors: Abdelhak Bouchakour, Mustafa Brahami, Layachi Zaghba
Abstract:
The use of photovoltaic energy offers not only an inexhaustible supply of energy but also a clean, non-polluting one, which is a definite advantage. The geographical location of Algeria promotes the development of this energy source, given the intensity of the radiation received and the duration of sunshine. For this reason, the objective of our work is to develop a software tool for calculating and optimizing the sizing of photovoltaic installations. Our optimization approach is based on mathematical models that, among other things, describe the operation of each part of the installation: energy production, storage, and consumption.
Keywords: solar panel, solar radiation, inverter, optimization
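As an illustration of the sizing relations such a tool implements, the sketch below applies the classical stand-alone PV formulas; all parameter values (load, sun hours, efficiencies, autonomy) are illustrative assumptions, not the paper's data.

```python
# Classical stand-alone PV sizing relations; numbers are placeholders.
def size_pv_system(daily_load_wh, peak_sun_hours, system_eff=0.75,
                   autonomy_days=2, bus_voltage=24, depth_of_discharge=0.6):
    # Required array peak power (Wp): daily load corrected for system losses,
    # delivered over the site's equivalent peak sun hours
    array_wp = daily_load_wh / (peak_sun_hours * system_eff)
    # Battery bank capacity (Ah) for the chosen days of autonomy,
    # limited by the allowed depth of discharge
    battery_ah = (daily_load_wh * autonomy_days) / (bus_voltage * depth_of_discharge)
    return array_wp, battery_ah

wp, ah = size_pv_system(daily_load_wh=3000, peak_sun_hours=5.5)  # e.g. a sunny Algerian site
print(f"array ≈ {wp:.0f} Wp, battery ≈ {ah:.0f} Ah")
```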
Procedia PDF Downloads 608
2704 Creating Energy Sustainability in an Enterprise
Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala
Abstract:
As we enter the new era of Artificial Intelligence (AI) and Cloud Computing, we rely heavily on the Machine Learning and Natural Language Processing capabilities of AI, and on energy-efficient hardware and software devices, in almost every industry sector. In these sectors, much emphasis is placed on developing new and innovative methods for producing and conserving energy and on slowing the depletion of natural resources. The core pillars of sustainability are economic, environmental, and social, informally referred to as the 3 P's (People, Planet and Profits). The 3 P's play a vital role in creating a core sustainability model in the enterprise. Natural resources are continually being depleted, so there is growing focus on and demand for renewable energy. With this growing demand, there is also a growing concern in many industries about how to reduce carbon emissions and conserve natural resources while adopting sustainability in corporate business models and policies. In our paper, we discuss the driving forces, such as climate change, natural disasters, pandemics, disruptive technologies, corporate policies, scaled business models, and emerging social media and AI platforms, that influence the three main pillars of sustainability (the 3 P's). Through this paper, we provide an overall perspective on enterprise strategies, with a primary focus on bringing about the cultural shifts needed to adopt energy-efficient operational models. Overall, many industries across the globe are incorporating core sustainability principles such as reducing energy costs, reducing greenhouse gas (GHG) emissions, reducing waste and increasing recycling, adopting advanced monitoring and metering infrastructure, and reducing server footprint and compute resources (shared IT services, cloud computing, and application modernization), with the vision of a sustainable environment.
Keywords: climate change, pandemic, disruptive technology, government policies, business model, machine learning and natural language processing, AI, social media platform, cloud computing, advanced monitoring, metering infrastructure
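As a toy illustration of how metering data can feed such decisions, the sketch below estimates the emissions avoided by reducing server load; the grid emission factor and power figures are placeholders, not values from the paper.

```python
# Toy GHG estimate from metered energy use; all numbers are assumptions.
GRID_FACTOR_KG_CO2E_PER_KWH = 0.4  # placeholder grid emission factor

def annual_emissions_kg(avg_power_kw, hours_per_year=8760,
                        factor=GRID_FACTOR_KG_CO2E_PER_KWH):
    # energy (kWh) times emission factor gives kg CO2e per year
    return avg_power_kw * hours_per_year * factor

before = annual_emissions_kg(avg_power_kw=120)  # hypothetical on-premises footprint
after = annual_emissions_kg(avg_power_kw=45)    # after consolidation / cloud migration
print(f"estimated saving: {before - after:,.0f} kg CO2e/year")
```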
Procedia PDF Downloads 111
2703 Automatic Identification of Pectoral Muscle
Authors: Ana L. M. Pavan, Guilherme Giacomini, Allan F. F. Alves, Marcela De Oliveira, Fernando A. B. Neto, Maria E. D. Rosa, Andre P. Trindade, Diana R. De Pina
Abstract:
Mammography is an imaging modality used worldwide to diagnose breast cancer, even in asymptomatic women. Due to its wide availability, mammograms can be used to measure breast density and to predict cancer development. Women with increased mammographic density have a four- to six-fold increase in their risk of developing breast cancer. Therefore, studies have sought to quantify mammographic breast density accurately. In clinical routine, radiologists perform image evaluations through the BIRADS (Breast Imaging Reporting and Data System) assessment. However, this method has inter- and intra-individual variability. An automatic, objective method to measure breast density could relieve the radiologists' workload by providing a preliminary opinion. However, the pectoral muscle is a high-density tissue with characteristics similar to those of fibroglandular tissue, which makes it hard to quantify mammographic breast density automatically. A pre-processing step is therefore needed to segment the pectoral muscle, which may otherwise be erroneously quantified as fibroglandular tissue. The aim of this work was to develop an automatic algorithm to segment and extract the pectoral muscle in digital mammograms. The database consisted of thirty medio-lateral oblique digital mammograms from São Paulo Medical School. This study was developed with ethical approval from the authors’ institutions and national review panels under protocol number 3720-2010. An algorithm was developed on the Matlab® platform for the pre-processing of the images. The algorithm uses image processing tools to automatically segment and extract the pectoral muscle from mammograms. First, a thresholding technique was applied to remove non-biological information from the image. Then, the Hough transform is applied to find the boundary of the pectoral muscle, followed by an active contour method whose seed is placed at the boundary found by the Hough transform. An experienced radiologist also performed the pectoral muscle segmentation manually. Both methods, manual and automatic, were compared using the Jaccard index and Bland-Altman statistics. The comparison between the manual and the developed automatic method presented a Jaccard similarity coefficient greater than 90% for all analyzed images, showing the efficiency and accuracy of the proposed segmentation method. The Bland-Altman statistics compared both methods with respect to the area (mm²) of the segmented pectoral muscle. The statistics showed data within the 95% confidence interval, confirming the accuracy of the segmentation compared to the manual method. Thus, the method proved to be accurate and robust, segmenting rapidly and free from intra- and inter-observer variability. It is concluded that the proposed method may be used reliably to segment the pectoral muscle in digital mammography in clinical routine. The segmentation of the pectoral muscle is very important for further quantification of the fibroglandular tissue volume present in the breast.
Keywords: active contour, fibroglandular tissue, hough transform, pectoral muscle
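A schematic version of the pipeline, transposed from Matlab to Python with scikit-image for illustration; the parameters, helper names, and the use of scikit-image are assumptions, not the authors' code.

```python
import numpy as np
from skimage import filters, feature, transform, segmentation

def segment_pectoral(mammo):
    # 1) threshold away non-biological background (labels, air)
    breast = mammo * (mammo > filters.threshold_otsu(mammo))
    # 2) Hough transform on an edge map to find the straight muscle boundary
    edges = feature.canny(breast)
    h, angles, dists = transform.hough_line(edges)
    _, angle, dist = (v[0] for v in transform.hough_line_peaks(h, angles, dists))
    # 3) seed an active contour along the detected line and refine it
    #    (assumes a non-horizontal boundary, i.e. cos(angle) != 0)
    rows = np.linspace(0, mammo.shape[0] - 1, 200)
    cols = (dist - rows * np.sin(angle)) / np.cos(angle)
    init = np.stack([rows, np.clip(cols, 0, mammo.shape[1] - 1)], axis=1)
    snake = segmentation.active_contour(breast, init, alpha=0.01, beta=1.0)
    return snake  # refined contour of the pectoral muscle

def jaccard(mask_a, mask_b):
    # agreement between the automatic and manual segmentation masks
    return np.logical_and(mask_a, mask_b).sum() / np.logical_or(mask_a, mask_b).sum()
```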
Procedia PDF Downloads 350
2702 Occupational Heat Stress Condition According to Wet Bulb Globe Temperature Index in Textile Processing Unit: A Case Study of Surat, Gujarat, India
Authors: Dharmendra Jariwala, Robin Christian
Abstract:
Thermal exposure is a common problem in every manufacturing industry where heat is used in the manufacturing process. In developing countries like India, a lack of awareness of proper work-environment conditions is observed among workers. Improper planning of the factory building, the arrangement of machinery, the ventilation system, etc., plays a vital role in the rise of temperature within the manufacturing areas. Due to uncontrolled thermal stress, workers may be subjected to various heat illnesses, from mild disorders to heat stroke. Heat stress is responsible for health risks and for a reduction in production. The Wet Bulb Globe Temperature (WBGT) index and relative humidity are used to evaluate heat stress conditions. The WBGT index is a weighted average of the natural wet bulb temperature, globe temperature, and dry bulb temperature, which are measured with the standard QuestTemp 36 area heat stress monitor. In this study, textile processing units were selected in an industrial estate in the city of Surat. Based on the manufacturing process, five locations were identified within the plant at which processes are undertaken at 120°C to 180°C: the jet dyeing machine area, stenter machine area, printing machine area, looping machine area, and washing area, all of which generate process heat. The office area was also selected, for comparison purposes, as a sixth location. The present study was conducted in the winter and summer seasons, for the day and night shifts. The results show that the average WBGT index was above the Threshold Limit Value (TLV) during the summer season for both the day and night shifts in all three industries, except in the office area. During the summer season, the highest WBGT index of 32.8°C was found during the day shift, and 31.5°C during the night shift, at the printing machine area. Likewise, during the winter season, the highest WBGT indices of 30°C and 29.5°C were found at the printing machine area during the day and night shifts, respectively.
Keywords: relative humidity, textile industry, thermal stress, WBGT
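For reference, the WBGT weightings standardized in ISO 7243, which instruments such as the QuestTemp 36 apply internally, can be written as a short sketch; the temperature readings and the TLV used in the comparison below are illustrative values, not the study's data.

```python
# Standard WBGT weightings (ISO 7243); readings and TLV are placeholders.
def wbgt_indoor(t_nwb, t_g):
    # no solar load: natural wet bulb (0.7) + globe temperature (0.3)
    return 0.7 * t_nwb + 0.3 * t_g

def wbgt_outdoor(t_nwb, t_g, t_db):
    # with solar load: dry bulb temperature enters with a 0.1 weight
    return 0.7 * t_nwb + 0.2 * t_g + 0.1 * t_db

reading = wbgt_indoor(t_nwb=29.0, t_g=41.0)  # hypothetical printing-area reading
tlv = 27.5                                   # illustrative threshold, work-rate dependent
print(f"WBGT = {reading:.1f} °C, {'above' if reading > tlv else 'within'} TLV")
```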
Procedia PDF Downloads 173