Search results for: images processing
1745 Analysis of Radiation-Induced Liver Disease (RILD) and Evaluation of Relationship between Therapeutic Activity and Liver Clearance Rate with Tc-99m-Mebrofenin in Yttrium-90 Microspheres Treatment
Authors: H. Tanyildizi, M. Abuqebitah, I. Cavdar, M. Demir, L. Kabasakal
Abstract:
Aim: Whole liver radiation has a modest benefit in the treatment of unresectable hepatic metastases, but the radiation doses must be kept under control; otherwise, RILD complications may arise. In this study, we aimed to calculate the maximum permissible activity (MPA) and critical organ absorbed doses with MIRD methodology, to evaluate tumour doses for treatment response and whole liver doses for RILD, and additionally to find the optimal liver function test. Materials and Methods: This study includes 29 patients who were referred to our nuclear medicine department for Y-90 microspheres treatment. 10 mCi of Tc-99m MAA was administered to the patients intravenously for dosimetry. After the injection, whole body SPECT/CT images were taken within one hour. Taking the minimum therapeutic tumour dose to be 120 Gy, the amounts of activity were calculated with MIRD methodology considering the volumetric tumour/liver rate. A sub-working group of 11 patients was created randomly, and the liver clearance rate with Tc-99m-Mebrofenin was calculated according to the Ekman formalism. Results: The volumetric tumour/liver rates were found to be between 33-66% (Maximum Tolerable Dose (MTD) 48-52 Gy) for 4 patients and less than 33% (MTD 72 Gy) for 25 patients. According to these results, the average amount of activity, mean liver dose, and mean tumour dose were found to be 1793.9±1.46 MBq, 32.86±0.19 Gy, and 138.26±0.40 Gy, respectively. RILD was not observed in any patient. In the sub-working group, the correlations of Bilirubin, Albumin, INR (which show the presence of liver disease and its degree), and liver clearance with Tc-99m-Mebrofenin with the calculated activity amounts were found to be r=0.49, r=0.27, r=0.43, and r=0.57, respectively. Discussion: The minimum tumour dose was found to be 120 Gy for a positive dose-response relation. If the volumetric tumour/liver rate was > 66%, the dose was limited to 30 Gy; if 33-66%, to 48 Gy; if < 33%, to 72 Gy.
These dose limitations did not create RILD. Clearance measurement with Mebrofenin was concluded to be the best method to determine liver function. Therefore, the liver clearance rate with Tc-99m-Mebrofenin should be considered in the calculation of yttrium-90 microspheres dosimetry.

Keywords: clearance, dosimetry, liver, RILD
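The MIRD-style activity calculation described above can be sketched as follows. The 49.67 Gy·kg/GBq dose factor is the standard conversion for Y-90 microspheres; the liver mass and tumour fraction used below are illustrative assumptions, not values from this study.

```python
def y90_activity_gbq(target_dose_gy, liver_mass_kg, tumour_fraction):
    # MIRD-style estimate for Y-90 microspheres:
    # D [Gy] = 49.67 * A [GBq] / M [kg]  =>  A = D * M / 49.67
    # The activity is scaled to the tumour compartment mass (partition idea).
    tissue_mass = liver_mass_kg * tumour_fraction
    return target_dose_gy * tissue_mass / 49.67

# Hypothetical example: 120 Gy to a tumour occupying 25% of a 1.8 kg liver
activity = y90_activity_gbq(120.0, 1.8, 0.25)
print(f"{activity * 1000:.1f} MBq")
```

In a full treatment plan this single-compartment sketch would be extended with the lung shunt fraction and the normal-liver dose limit before fixing the prescribed activity.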
Procedia PDF Downloads 440

1744 Investigation of Supercapacitor Properties of Nanocomposites Obtained from Acid and Base-functionalized Multi-walled Carbon Nanotube (MWCNT) and Polypyrrole (PPy)
Authors: Feridun Demir, Pelin Okdem
Abstract:
Polymers are versatile materials with many unique properties, such as low density, reasonable strength, flexibility, and easy processability. However, the mechanical properties of these materials are insufficient for many engineering applications. Therefore, there is a continuous search for new polymeric materials with improved properties. Polymeric nanocomposites are an advanced class of composite materials that have attracted great attention in both academic and industrial fields. Since nano-reinforcement materials are very small in size, they provide an ultra-large interfacial area per volume between the nano-element and the polymer matrix. This allows nano-reinforced composites to exhibit enhanced toughness without compromising hardness or optical clarity. PPy and MWCNT/PPy nanocomposites were synthesized by the chemical oxidative polymerization method, and the supercapacitor properties of the obtained nanocomposites were investigated. In addition, pure MWCNT was functionalized with acid (H₂SO₄/H₂O₂) and base (NH₄OH/H₂O₂) solutions at a ratio of 3:1, and a-MWCNT/d-PPy and b-MWCNT/d-PPy nanocomposites were obtained. The homogeneous distribution of MWCNTs in the polypyrrole matrix and the shell-core type morphological structures of the nanocomposites were observed in SEM images. SEM, FTIR, and XRD analyses showed that the functional groups formed by the functionalization of MWCNTs caused the MWCNTs to come together and partially agglomerate. The conductivity of the nanocomposites consisting of MWCNT and d-PPy was found to be higher than that of pure d-PPy. CV, GCD, and EIS results show that the use of a-MWCNT and b-MWCNT in nanocomposites affects the supercapacitor properties of the materials positively at low particle content but negatively at high particle content.
It was revealed that the functional MWCNT particles combined in nanocomposites with high particle content cause a decrease in the conductivity and distribution of ions in the electrodes and, thus, a decrease in their energy storage capacity.

Keywords: polypyrrole, multi-walled carbon nanotube (MWCNT), conducting polymer, chemical oxidative polymerization, nanocomposite, supercapacitor
Procedia PDF Downloads 21

1743 Identification and Classification of Medicinal Plants of Indian Himalayan Region Using Hyperspectral Remote Sensing and Machine Learning Techniques
Authors: Kishor Chandra Kandpal, Amit Kumar
Abstract:
The Indian Himalaya region harbours approximately 1748 plants of medicinal importance, and as per the International Union for Conservation of Nature (IUCN), 112 plant species among these are threatened or endangered. To ease the pressure on these plants, the government of India is encouraging their in-situ cultivation. Saussurea costus, Valeriana jatamansi, and Picrorhiza kurroa have also been prioritized for large-scale cultivation owing to their market demand, conservation value, and medicinal properties. These species are found at elevations from 1000 m to 4000 m in the Indian Himalaya. Identification of these plants in the field requires taxonomic skills, which is one of the major bottlenecks in their conservation and management. In recent years, hyperspectral remote sensing techniques have been used effectively for the discrimination of plant species with the help of their unique spectral signatures. Against this background, a spectral library of the above three medicinal plants was prepared by collecting spectral data using a handheld spectroradiometer (325 to 1075 nm) from farmers' fields in the Himachal Pradesh and Uttarakhand states of the Indian Himalaya. A Random Forest (RF) model was applied to the spectral data for the classification of the medicinal plants. The standard 80:20 split ratio was followed for training and validation of the RF model, which resulted in a training accuracy of 84.39% (kappa coefficient = 0.72) and a testing accuracy of 85.29% (kappa coefficient = 0.77). This RF classifier identified green (555 to 598 nm), red (605 nm), and near-infrared (725 to 840 nm) wavelength regions as suitable for the discrimination of these species. The findings of this study provide a technique for rapid and on-site identification of the above medicinal plants in the field. This will also be a key input for the classification of hyperspectral remote sensing images for mapping these species in farmers' fields on a regional scale.
This is a pioneering study for the Indian Himalaya region in which the applicability of hyperspectral remote sensing to medicinal plants has been explored.

Keywords: himalaya, hyperspectral remote sensing, machine learning, medicinal plants, random forests
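The Random Forest workflow described above (80:20 split, accuracy and kappa coefficient) can be sketched with scikit-learn. The synthetic spectra below merely stand in for the 325-1075 nm spectroradiometer data; the class count and noise level are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

rng = np.random.default_rng(0)
# Synthetic stand-in for reflectance spectra (one band per nm, 325-1075 nm)
n_per_class, n_bands = 60, 751
X_parts, y = [], []
for label in range(3):                      # three species, as in the study
    base = rng.random(n_bands)              # class-specific mean spectrum
    X_parts.append(base + rng.normal(0, 0.05, (n_per_class, n_bands)))
    y += [label] * n_per_class
X, y = np.vstack(X_parts), np.array(y)

# 80:20 training/validation split, as in the study
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
print(accuracy_score(y_te, pred), cohen_kappa_score(y_te, pred))
```

On real spectra, `clf.feature_importances_` would point at the discriminative wavelength regions the abstract reports (green, red, near-infrared).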
Procedia PDF Downloads 203

1742 Tropical Squall Lines in Brazil: A Methodology for Identification and Analysis Based on ISCCP Tracking Database
Authors: W. A. Gonçalves, E. P. Souza, C. R. Alcântara
Abstract:
The ISCCP-Tracking database offers an opportunity to study the physical and morphological characteristics of Convective Systems based on geostationary meteorological satellites. This database contains 26 years of tracking of Convective Systems for the entire globe; the Tropical Squall Lines which occur in Brazil are therefore certainly within the database. In this study, we propose a methodology for the identification of these systems based on the ISCCP-Tracking database. A physical and morphological characterization of these systems is also shown. The proposed methodology was first developed for the year 2007. The Squall Lines were subjectively identified by visually analyzing infrared images from GOES-12. Based on this identification, the same systems were located within the ISCCP-Tracking database. It is known, and was also observed here, that the Squall Lines which occur on the north coast of Brazil develop parallel to the coast, influenced by the sea breeze. In addition, it was observed that the eccentricity of the identified systems was greater than 0.7. A methodology based on the inclination (relative to the coast) and eccentricity (greater than 0.7) of the Convective Systems was then applied in order to identify and characterize Tropical Squall Lines in Brazil. These thresholds were applied to the ISCCP-Tracking database for the year 2007. It was observed that other systems, which were not Squall Lines, were also identified. We therefore decided to call all systems identified by the inclination and eccentricity thresholds Linear Convective Systems, instead of Squall Lines. After this step, the Linear Convective Systems were identified and characterized for the entire database, from 1983 to 2008. The physical and morphological characteristics of these systems were compared to those of systems which did not have the required inclination and eccentricity to be called Linear Convective Systems.
The results showed that the convection associated with the Linear Convective Systems seems to be more intense and organized than in the other systems. This affirmation is based on all ISCCP-Tracking variables analyzed. This type of methodology, which explores 26 years of satellite data by an objective analysis, was not previously explored in the literature. The physical and morphological characterization of the Linear Convective Systems based on 26 years of data is of great importance and should be used in many branches of the atmospheric sciences.

Keywords: squall lines, convective systems, linear convective systems, ISCCP-Tracking
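The eccentricity/inclination screening described above can be sketched as a simple geometric filter. The e > 0.7 threshold comes from the abstract; the 20° coast-parallel tolerance and the example axis lengths are assumptions for illustration.

```python
import math

def eccentricity(major_axis, minor_axis):
    # Ellipse-fit eccentricity of a convective system: e = sqrt(1 - (b/a)^2)
    return math.sqrt(1.0 - (minor_axis / major_axis) ** 2)

def is_linear_convective_system(major, minor, tilt_deg, coast_deg, tol_deg=20.0):
    # A system qualifies when it is elongated (e > 0.7, as in the study)
    # and roughly parallel to the coastline (tolerance assumed here).
    # Orientation difference wraps at 180 degrees (an ellipse has no "front").
    angle_diff = abs((tilt_deg - coast_deg + 90) % 180 - 90)
    return eccentricity(major, minor) > 0.7 and angle_diff <= tol_deg

print(is_linear_convective_system(300, 80, 40, 45))   # elongated, near-parallel
print(is_linear_convective_system(120, 110, 40, 45))  # nearly circular
```

In the study itself the axes and orientations would come from the ellipse fits stored per system in the ISCCP-Tracking records.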
Procedia PDF Downloads 301

1741 The Portrayal of Violence Against Women in Bangladesh News Media: Seeing It Through Rumana Manzur’s Case
Authors: Zerrin Akter Anni
Abstract:
The media's role in shaping perceptions of violence against women (VAW) and their portrayal in news reporting significantly influences our understanding of this critical issue. My research delves into the portrayal of violence against women in mainstream media, using the prominent case of Dr. Rumana Manzur, a former UBC Fulbright Scholar from Bangladesh who suffered a brutal assault by her ex-husband in June 2011. Employing a qualitative research approach, this study uses an ethnographic media analysis method to scrutinize news reports of the aforementioned case from selected newspapers in Bangladesh. The primary objectives are to investigate how the popular news media in Bangladesh addresses the issue of violence against women and frames the victims of such violence. The findings of this research highlight that news media can perpetuate gender stereotypes and subtly shift blame onto the victim through various techniques, creating intricate interactions between the reader and the text. These techniques include sensationalized headlines, textual content, and graphic images. This victim-blaming process not only retraumatizes the survivor but also distorts the actual facts when presenting the case to a larger audience. Consequently, the representation of violence against women cases in media, particularly the portrayal of women as victims during reporting, significantly impacts our collective comprehension of this issue. In conclusion, this paper asserts that the Bangladeshi media, particularly news outlets, in conjunction with society, continue to follow a pattern of depicting gender-based violence in ways that devalue the image of women. This research underscores the need for critical analysis of media representations of violence against women cases, as they can perpetuate harmful stereotypes and hinder efforts to combat this pervasive problem. 
Therefore, this research seeks to comprehend the complex dynamics between media and violence against women, which is essential for fostering a more empathetic and informed society that actively works towards eradicating this problem.

Keywords: media representation, violence against women (vaw), ethnographic media analysis, victim-blaming, sensationalized headline
Procedia PDF Downloads 74

1740 General Time-Dependent Sequenced Route Queries in Road Networks
Authors: Mohammad Hossein Ahmadi, Vahid Haghighatdoost
Abstract:
Spatial databases have been an active area of research over the years. In this paper, we study how to answer General Time-Dependent Sequenced Route queries. Given the origin and destination of a user over a time-dependent road network graph, an ordered list of categories of interest, and a departure time interval, our goal is to find the minimum travel time path, along with the best departure time, that minimizes the total travel time from the source location to the given destination, passing through a sequence of points of interest belonging to each of the specified categories of interest. The challenge of this problem is the added complexity over optimal sequenced route queries: first, the road network is time-dependent, and second, the user defines a departure time interval instead of a single departure time instance. For processing general time-dependent sequenced route queries, we propose two solutions, the Discrete-Time and Continuous-Time Sequenced Route approaches, which find approximate and exact solutions, respectively. Our proposed approaches traverse the road network based on the A*-search paradigm equipped with an efficient heuristic function for shrinking the search space. Extensive experiments are conducted to verify the efficiency of our proposed approaches.

Keywords: trip planning, time dependent, sequenced route query, road networks
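The Discrete-Time idea above can be sketched as a sweep of candidate departure times over a time-dependent shortest-path search. The toy graph and travel-time functions are illustrative assumptions, and plain Dijkstra stands in for the paper's heuristic-guided A* (and omits the category-sequence constraint) for brevity.

```python
import heapq

def td_shortest_travel_time(graph, source, target, depart):
    """Dijkstra on a FIFO time-dependent graph; edge weights depend on entry time."""
    best = {source: depart}
    pq = [(depart, source)]
    while pq:
        t, u = heapq.heappop(pq)
        if u == target:
            return t - depart
        if t > best.get(u, float("inf")):
            continue  # stale queue entry
        for v, travel_time in graph.get(u, []):
            arrive = t + travel_time(t)
            if arrive < best.get(v, float("inf")):
                best[v] = arrive
                heapq.heappush(pq, (arrive, v))
    return float("inf")

# Toy network: edge A->B is congested before t=30, B->C is constant
graph = {
    "A": [("B", lambda t: 20 if t < 30 else 10)],
    "B": [("C", lambda t: 5)],
}

# Discrete-Time approach: try each departure time in the interval, keep the best
candidates = {t0: td_shortest_travel_time(graph, "A", "C", t0)
              for t0 in range(0, 60, 10)}
best_depart = min(candidates, key=candidates.get)
print(best_depart, candidates[best_depart])
```

Refining the departure-time grid tightens the approximation; the Continuous-Time approach instead reasons over the travel-time functions exactly.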
Procedia PDF Downloads 321

1739 Biotech Processes to Recover Valuable Fraction from Buffalo Whey Usable in Probiotic Growth, Cosmeceutical, Nutraceutical and Food Industries
Authors: Alberto Alfano, Sergio D’ambrosio, Darshankumar Parecha, Donatella Cimini, Chiara Schiraldi
Abstract:
The main objective of this study is the setup of an efficient small-scale platform for the conversion of local renewable waste materials, such as whey, into added-value products, thereby reducing the environmental impact and costs deriving from the disposal of processing waste. Buffalo milk whey derived from the cheese-making process, called second cheese whey, is the main by-product of the dairy industry. Whey is the main and most polluting by-product of cheese manufacturing, consisting of lactose, lactic acid, proteins, and salts that make it a potential added-value product. In Italy, and in particular in the Campania region, soft cheese production generates a large volume of liquid waste, especially during late spring and summer. This project is part of a circular economy perspective focused on the conversion of potentially polluting and difficult-to-purify waste into a resource to be exploited, and it embodies the concept of the three “R”s: reduce, recycle, and reuse. Special focus was paid to the production of health-promoting biomolecules and biopolymers, which may be exploited in different segments of the food and pharmaceutical industries. These biomolecules may be recovered through appropriate processes and reused to obtain added-value products. Accordingly, ultrafiltration and nanofiltration processes were performed to fractionate bioactive components starting from buffalo milk whey. In this direction, the present study focused on the implementation of a downstream process that converts waste generated by the food and food processing industries into added-value products with potential applications. Owing to innovative downstream and biotechnological processes, what would otherwise be a waste product may be considered a resource for obtaining high added-value products, such as food supplements (probiotics), cosmeceuticals, biopolymers, and recyclable purified water.
Besides targeting gastrointestinal disorders, probiotics such as Lactobacilli have been reported to improve immunomodulation and protection of the host against infections caused by viral and bacterial pathogens. Interestingly, inactivated microbial (probiotic) cells and their metabolic products, indicated as parabiotics and postbiotics, respectively, also have a crucial role and act as mediators in the modulation of the host’s immune function. To boost the production of biomass (both viable and/or heat-inactivated cells) and/or the synthesis of growth-related postbiotics, such as EPS, efficient and sustainable fermentation processes are necessary. Based on a “zero-waste” approach, wastes generated by local industries can be recovered and recycled to develop sustainable biotechnological processes to obtain probiotics as well as postbiotics and parabiotics, to be tested as bioactive compounds against gastrointestinal disorders. The results have shown that it was possible to recover an ultrafiltration retentate with characteristics suitable for use against skin dehydration, for forming films (e.g., packaging for the food industry), or as a wound repair agent, and a nanofiltration retentate from which to recover lactic acid and carbon sources (e.g., lactose, glucose) for microbial cultivation. Alongside this, the last goal is to obtain purified water that can be reused throughout the process. In fact, water reclamation and reuse provide a unique and viable opportunity to augment traditional water supplies, a key issue nowadays.

Keywords: biotech process, downstream process, probiotic growth, from waste to product, buffalo whey
Procedia PDF Downloads 69

1738 Information Disclosure And Financial Sentiment Index Using a Machine Learning Approach
Authors: Alev Atak
Abstract:
In this paper, we aim to create a financial sentiment index by investigating companies’ voluntary information disclosures. We retrieve structured content from BIST 100 companies’ financial reports for the period 1998-2018 and extract relevant financial information for sentiment analysis through Natural Language Processing. We measure strategy-related disclosures and their cross-sectional variation and classify report content into generic sections using synonym lists divided into four main categories according to their liquidity risk profile, risk positions, intra-annual information, and exposure to risk. We use Word Error Rate and Cosine Similarity for comparing and measuring text similarity and deviation in sets of texts. In addition to performing text extraction, we provide a range of text analysis options, such as readability metrics, word counts using pre-determined lists (e.g., forward-looking, uncertainty, tone, etc.), and comparison with a reference corpus (at the word, part-of-speech, and semantic levels). We thereby create an adequate analytical tool and a financial dictionary to depict the importance of granular financial disclosure for investors to correctly identify risk-taking behavior and hence make the aggregated effects traceable.

Keywords: financial sentiment, machine learning, information disclosure, risk
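The cosine-similarity comparison between report sections mentioned above can be sketched with bag-of-words counts. The whitespace tokenization and the sample sentences are illustrative assumptions, not the paper's pipeline.

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    # Bag-of-words cosine similarity between two report sections
    a = Counter(text_a.lower().split())
    b = Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

s1 = "liquidity risk exposure increased during the period"
s2 = "liquidity risk exposure decreased during the period"
print(round(cosine_similarity(s1, s1), 3))  # identical sections score 1.0
print(round(cosine_similarity(s1, s2), 3))
```

Word Error Rate, the other metric named in the abstract, would instead count the edit operations needed to turn one token sequence into the other.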
Procedia PDF Downloads 94

1737 Role of Speech Language Pathologists in Vocational Rehabilitation
Authors: Marlyn Mathew
Abstract:
Communication is the key factor in any vocational/job set-up. However, many persons with disabilities have deficits in this very area in terms of comprehension, expression, and cognitive skills, making it difficult for them to get employed appropriately or stay employed. Vocational rehabilitation is a continuous and coordinated process which involves the provision of vocation-related services designed to enable a person with a disability to obtain and maintain employment. The role of the speech language pathologist is therefore crucial in assessing the communication deficits and needs of the individual at the various phases of employment: from seeking a job and attending interviews with suitable employers, and at regular intervals throughout employment. This article discusses the communication deficits and obstacles faced by individuals with special needs, including but not limited to cognitive-linguistic deficits, executive function deficits, and speech and language processing difficulties, as well as strategies that can be introduced in the workplace to overcome these obstacles, such as the use of visual cues, checklists, and flow charts. The paper also throws light on the importance of educating colleagues and work partners about the communication difficulties faced by the individual. This would help to reduce communication barriers in the workplace, help colleagues develop an empathetic approach, and reduce misunderstandings that can arise as a result of the communication impairment.

Keywords: vocational rehabilitation, disability, speech language pathologist, cognitive, linguistics
Procedia PDF Downloads 135

1736 Comparison of Data Reduction Algorithms for Image-Based Point Cloud Derived Digital Terrain Models
Authors: M. Uysal, M. Yilmaz, I. Tiryakioğlu
Abstract:
A Digital Terrain Model (DTM) is a digital numerical representation of the Earth's surface. DTMs have been applied to a diverse range of tasks, such as urban planning, military applications, glacier mapping, and disaster management. Expressing the Earth's surface as a mathematical model would require an infinite number of point measurements. Since this is impossible, points at regular intervals are measured to characterize the Earth's surface, and a DTM of the Earth is generated. Classical measurement techniques and photogrammetry have hitherto been widely used in the construction of DTMs. At present, RADAR, LiDAR, and stereo satellite images are also used for DTM construction. In recent years, especially because of its advantages, Airborne Light Detection and Ranging (LiDAR) has seen increased use in DTM applications. A 3D point cloud is created with LiDAR technology by obtaining numerous point data. Recently, however, with developments in image mapping methods, the use of unmanned aerial vehicles (UAVs) for photogrammetric data acquisition has increased DTM generation from image-based point clouds. The accuracy of a DTM depends on various factors such as the data collection method, the distribution of elevation points, the point density, the properties of the surface, and the interpolation methods. In this study, the random data reduction method is evaluated for DTMs generated from image-based point cloud data. The original image-based point cloud data set (100%) was reduced to a series of subsets by using a random algorithm, representing 75, 50, 25, and 5% of the original data set. Over the ANS campus of Afyon Kocatepe University as the test area, the DTM constructed from the original image-based point cloud data set was compared with DTMs interpolated from the reduced data sets by the Kriging interpolation method.
The results show that the random data reduction method can be used to reduce image-based point cloud datasets to the 50% density level while still maintaining the quality of the DTM.

Keywords: DTM, Unmanned Aerial Vehicle (UAV), uniform, random, kriging
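The random reduction step described above, thinning a point cloud to the 75/50/25/5% subsets, can be sketched as follows. The synthetic x, y, z points are an illustrative stand-in for the image-based cloud.

```python
import numpy as np

def random_reduce(points, keep_fraction, seed=0):
    # Randomly subsample a point cloud (N x 3 array of x, y, z)
    # to a target density, without replacement.
    rng = np.random.default_rng(seed)
    n_keep = int(round(len(points) * keep_fraction))
    idx = rng.choice(len(points), size=n_keep, replace=False)
    return points[idx]

cloud = np.random.default_rng(1).random((10_000, 3))  # synthetic x, y, z points
for frac in (0.75, 0.50, 0.25, 0.05):                 # subsets used in the study
    subset = random_reduce(cloud, frac)
    print(frac, len(subset))
```

Each subset would then be interpolated to a grid (Kriging in the study) and differenced against the full-density DTM to quantify the quality loss.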
Procedia PDF Downloads 155

1735 Expert-Driving-Criteria Based on Fuzzy Logic Approach for Intelligent Driving Diagnosis
Authors: Andrés C. Cuervo Pinilla, Christian G. Quintero M., Chinthaka Premachandra
Abstract:
This paper considers the diagnosis of people’s driving skills under real driving conditions. In that sense, this research presents an approach that uses GPS signals, which have a direct correlation with driving maneuvers. In addition, a novel expert-driving-criteria approximation using fuzzy logic is presented, which seeks to analyze GPS signals in order to issue an intelligent driving diagnosis. Based on the above, this work presents in the first section the intelligent driving diagnosis system approach in terms of its characteristic properties, explaining in detail significant considerations about how an expert-driving-criteria approximation must be developed. In the next section, the implementation of our developed system based on the proposed fuzzy logic approach is explained. Here, a proposed set of rules is presented, which corresponds to a quantitative abstraction of traffic laws and safe driving techniques, seeking to approach an expert-driving-criteria approximation. Experimental testing has been performed under real driving conditions. The testing results show that the intelligent driving diagnosis system qualifies a driver’s performance quantitatively with a high degree of reliability.

Keywords: driver support systems, intelligent transportation systems, fuzzy logic, real time data processing
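A fuzzy rule of the kind described above can be sketched with a triangular membership function over a GPS-derived quantity. The "harsh braking" label and the 2/4/6 m/s² breakpoints are illustrative assumptions, not the paper's calibrated expert criteria.

```python
def triangular(x, a, b, c):
    # Triangular membership function: 0 outside [a, c], peaking at 1 when x = b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def harsh_braking_degree(decel_mps2):
    # Degree to which a GPS-derived deceleration counts as "harsh braking";
    # the breakpoints (2, 4, 6 m/s^2) are assumed for illustration.
    return triangular(decel_mps2, 2.0, 4.0, 6.0)

for d in (1.0, 3.0, 4.0, 5.5):
    print(d, harsh_braking_degree(d))
```

A full diagnosis would aggregate many such memberships (speeding, cornering, acceleration) through the rule base and defuzzify the result into a driver score.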
Procedia PDF Downloads 518

1734 Mechanical Characterization and Metallography of Sintered Aluminium-Titanium Diboride Metal Matrix Composite
Authors: Sai Harshini Irigineni, Suresh Kumar Reddy Narala
Abstract:
The industrial applicability of aluminium metal matrix composites (AMMCs) has been growing rapidly due to their exceptional material traits such as low weight, high strength, excellent thermal performance, and corrosion resistance. The increasing demand for AMMCs in automobile, aviation, aerospace, and defence ventures has opened up windows of opportunity for the development of processing methods that facilitate low-cost production of AMMCs with superior properties. In the present work, owing to its economy, efficiency, and suitability, the powder metallurgy (P/M) technique was employed to develop AMMCs with pure aluminium as the matrix material and titanium diboride (TiB₂) as the reinforcement. AMMC samples with different weight compositions (Al-0.1%TiB₂, Al-5%TiB₂, Al-10%TiB₂, and Al-15%TiB₂) were prepared through hot press compacting followed by traditional sintering. The developed AMMCs were subjected to metallographic studies and mechanical characterization. Experimental evidence shows significant improvement in mechanical properties such as tensile strength and hardness with increasing reinforcement content. The current study demonstrates the superiority of AMMCs over conventional metals and alloys, and the results obtained may be of immense use in material selection for different structural applications.

Keywords: AMMCs, mechanical characterization, powder metallurgy, TiB₂
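For P/M compacts like those above, a theoretical (fully dense) density for each weight composition is commonly estimated with the inverse rule of mixtures, against which measured sintered densities can be compared. This is a sketch; the handbook densities used (Al ≈ 2.70, TiB₂ ≈ 4.52 g/cm³) are typical literature values, not measurements from this study.

```python
def composite_density(weight_fractions, densities):
    # Inverse rule of mixtures for weight fractions: 1/rho_c = sum(w_i / rho_i)
    return 1.0 / sum(w / rho for w, rho in zip(weight_fractions, densities))

RHO_AL, RHO_TIB2 = 2.70, 4.52  # g/cm^3, typical handbook values

for wt_tib2 in (0.001, 0.05, 0.10, 0.15):  # compositions studied
    rho = composite_density((1 - wt_tib2, wt_tib2), (RHO_AL, RHO_TIB2))
    print(f"Al-{wt_tib2:.1%} TiB2: {rho:.3f} g/cm^3")
```

The ratio of measured to theoretical density then gives the relative density of each sintered sample, a standard check on compaction quality.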
Procedia PDF Downloads 131

1733 Mathematical Modeling of the Effect of Pretreatment on the Drying Kinetics, Energy Requirement and Physico-Functional Properties of Yam (Dioscorea Rotundata) and Cocoyam (Colocasia Esculenta)
Authors: Felix U. Asoiro, Kingsley O. Anyichie, Meshack I. Simeon, Chinenye E. Azuka
Abstract:
The work was aimed at studying the effects of microwave drying (450 W) and hot air oven drying on the drying kinetics and physico-functional properties of yam and cocoyam species. The yams and cocoyams were cut into chips of thicknesses of 3 mm, 5 mm, 7 mm, 9 mm, and 11 mm. The drying characteristics of the yam and cocoyam chips were investigated under microwave drying and hot air oven temperatures (50°C–90°C). Drying method, temperature, and thickness had significant effects on the drying characteristics and physico-functional properties of yam and cocoyam. The result of the experiment showed that an increase in the temperature increased the drying time. The result also showed that the microwave drying method took less time to dry the samples than the hot air oven drying method. The iodine affinity of starch for yam was higher than that of cocoyam for the microwave-dried samples over those of the hot air oven-dried samples. The results of the analysis would be useful in modeling the drying behavior of yams and cocoyams under different drying methods. They could also be useful in the improvement of shelf life for yams and cocoyams as well as in the design of efficient systems for drying, handling, storage, packaging, processing, and transportation of yams and cocoyams.

Keywords: cocoyam, drying, microwave, modeling, energy consumption, iodine affinity, drying rate
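Thin-layer drying kinetics of the kind modeled above are often summarized with the Lewis (Newton) model, MR = exp(-k·t), whose rate constant k can be fitted by linear regression on ln(MR). This is a sketch under that model assumption; the drying curve below is synthetic, not data from the study.

```python
import math

def fit_lewis_k(times, moisture_ratios):
    # Lewis (Newton) thin-layer model: MR = exp(-k t)  =>  ln(MR) = -k t.
    # Least-squares slope through the origin: k = -sum(t * ln(MR)) / sum(t^2)
    num = sum(t * math.log(mr) for t, mr in zip(times, moisture_ratios))
    den = sum(t * t for t in times)
    return -num / den

# Synthetic drying curve with k = 0.05 per minute (illustrative values only)
times = [10, 20, 30, 40, 60]
mr = [math.exp(-0.05 * t) for t in times]
k = fit_lewis_k(times, mr)
print(round(k, 4))  # recovers the 0.05 used to generate the data
```

Fitted k values per temperature, thickness, and drying method would then quantify the effects the abstract describes; richer models (e.g., Page) add an exponent on t.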
Procedia PDF Downloads 106

1732 Pyrolysis of Mixed Plastic Fractions with PP, PET and PA
Authors: Rudi P. Nielsen, Karina H. Hansen, Morten E. Simonsen
Abstract:
To improve the possibility of chemical recycling of mixed plastic waste, such as municipal plastic waste, work has been conducted to gain an understanding of the effect of typical polymers from waste (PP, PET, and PA) on the quality of the pyrolysis oil produced. Plastic fractions were pyrolyzed in a lab-scale reactor system, with mixture compositions of up to 15 wt.% PET and 5 wt.% PA in a PP matrix and processing conditions from 400 to 450°C. The experiments were conducted as a full factorial design and in duplicate to provide reliable results and the possibility to determine any interactions between the parameters. The products were analyzed using FT-IR and GC-MS for compositional information, as well as determination of the calorific value, ash content, acid number, density, viscosity, and elemental analysis to provide further data on the fuel quality of the pyrolysis oil. Oil yield was found to be between 61 and 84 wt.%, while char yield was below 2.6 wt.% in all cases. The calorific value of the produced oil was between 32 and 46 MJ/kg, averaging approx. 41 MJ/kg, thus close to that of heavy fuel oil. The oil product was characterized to contain aliphatic and cyclic hydrocarbons, alcohols, and ethers with chain lengths between 10 and 25 carbon atoms. Overall, it was found that the addition of PET decreased oil yield, while the addition of both PA and PET decreased oil quality in general, by increasing the acid number (PET), decreasing the calorific value (PA), and increasing the nitrogen content (PA). Furthermore, it was identified that temperature increased ammonia production from PA during pyrolysis, while ammonia production was decreased by the addition of PET.

Keywords: PET, plastic waste, polyamide, polypropylene, pyrolysis
Procedia PDF Downloads 148

1731 The Pyrolysis of Leather and Textile Waste in Carbonised Materials as an Element of the Circular Economy Model
Authors: Maciej Życki, Anna Kowalik-Klimczak, Monika Łożyńska, Wioletta Barszcz, Jolanta Drabik
Abstract:
Rapidly changing fashion trends generate huge amounts of leather and textile waste globally. The complexity of these types of waste makes recycling difficult in economic terms. Pyrolysis is suggested for this purpose, as it transforms heterogeneous and complex waste into added-value products, e.g., activated carbons and soil fertilizer. The possibility of using pyrolysis for the valorization of leather and textile waste is analyzed in this paper. In the first stage, leather and textile waste were subjected to TG/DTG thermogravimetric and DSC calorimetric analysis. These analyses provided basic information about the thermochemical transformations and degradation rates during the pyrolysis of these types of waste and enabled the selection of the pyrolysis temperature. In the next stage, the effect of the gas type used during pyrolysis on the physicochemical properties, composition, structure, and specific surface formation of the carbonized materials, produced by thermal treatment without oxygen access to the reaction chamber, was investigated. These studies contribute data on the thermal management and pyrolytic processing of leather and textile waste into useful carbonized materials, according to the circular economy model.

Keywords: pyrolysis, leather and textiles waste, composition and structure of carbonized materials, valorisation of waste, circular economy model
Procedia PDF Downloads 7
1730 Syntactic Analyzer for Tamil Language
Authors: Franklin Thambi Jose.S
Abstract:
Computational Linguistics is a branch of linguistics that deals with language at the computational level; it can also be described as a branch of language studies that applies computer techniques to the field of linguistics. Within Computational Linguistics, Natural Language Processing plays an important role, having come into existence with the advent of Information Technology. In computational syntax, a syntactic analyser breaks a sentence into phrases and clauses and annotates the sentence with syntactic information. Tamil is one of the major Dravidian languages, with a very long written history of more than 2000 years. It is mainly spoken in Tamil Nadu (in India), Sri Lanka, Malaysia and Singapore, where it is an official language; in Malaysia, Tamil-speaking people are considered an ethnic group. For this research, Tamil sentences are classified into four types, namely: 1. Main Sentence 2. Interrogative Sentence 3. Equational Sentence 4. Elliptical Sentence. In computational syntax, the first step is to provide the required information regarding the head and the constituents of each sentence. This information is incorporated into the system using programming languages, after which the system can analyse a given sentence against the criteria or mechanisms given to it. Providing the criteria or mechanisms the computer needs to identify these basic sentence types with a syntactic parser for the Tamil language is the major objective of this paper.
Keywords: tamil, syntax, criteria, sentences, parser
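The paper's four-way sentence classification could be sketched, very crudely, as rules over POS-tagged tokens; Tamil is verb-final (SOV), and equational sentences are copula-less. The toy tagset (`N`, `V`, `Q`, `PRT`) and the romanized examples below are hypothetical simplifications, not the paper's actual parser:

```python
# Illustrative sketch only: a toy classifier for the four Tamil sentence
# types named in the abstract, operating on hypothetical POS-tagged tokens.

def classify(tagged):
    """tagged: list of (token, POS) pairs for one sentence."""
    tags = [tag for _, tag in tagged]
    if "Q" in tags:                        # interrogative word or particle
        return "interrogative"
    if tags and tags[-1] == "V":           # Tamil is verb-final (SOV)
        return "main"
    if tags.count("N") >= 2 and "V" not in tags:
        return "equational"                # copula-less noun + noun sentence
    return "elliptical"                    # fragment with elided constituents

kind = classify([("avan", "N"), ("maanavan", "N")])   # toy equational sentence
```

A real syntactic analyser would of course build full phrase and clause structure rather than inspect tag sequences, but the rule shape above mirrors the head/constituent criteria the abstract describes.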
Procedia PDF Downloads 517
1729 Evaluation of Hazelnut Hulls as an Alternative Forage Resource for Ruminant Animals
Authors: N. Cetinkaya, Y. S. Kuleyin
Abstract:
The aim of this study was to estimate the digestibility of the fruit internal skin of different varieties of hazelnuts, in order to propose hazelnut fruit skin as an alternative roughage source in ruminant nutrition. In 2015, the fruit internal skins of three different varieties, round hazelnuts (RH), pointed hazelnuts (PH) and almond hazelnuts (AH), were obtained from a hazelnut processing factory, and their crude nutrient analyses were carried out. Organic matter digestibility (OMD) and metabolisable energy (ME) values of the hazelnut fruit skins were estimated from the gas measured by the in vitro gas production method. Their antioxidant activities were determined by a spectrophotometric method. Crude nutrient values of the three varieties were: organic matter (OM): 87.83, 87.81 and 87.78%; crude protein (CP): 5.97, 5.93 and 5.89%; neutral detergent fiber (NDF): 30.30, 30.29 and 30.29%; acid detergent fiber (ADF): 48.68, 48.67 and 48.66%; and acid detergent lignin (ADL): 25.43, 25.43 and 25.39%, respectively. OMD values from the 24 h incubation time of RH, PH and AH were 22.04, 22.46 and 22.74%; MEGP values were 3.69, 3.75 and 3.79 MJ/kg DM; and antioxidant activity values were 94.60, 94.54 and 94.52 IC 50 mg/mL, respectively. The fruit internal skin of different varieties of hazelnuts may be considered as an alternative roughage for ruminant nutrition with regard to their crude and digestible nutritive values. Moreover, hazelnut fruit skin has a rich antioxidant content, so it may be used as a feed additive for both ruminant and non-ruminant animals.
Keywords: antioxidant activity, hazelnut fruit skin, metabolizable energy, organic matter digestibility
Procedia PDF Downloads 302
1728 Dynamic Model for Forecasting Rainfall Induced Landslides
Authors: R. Premasiri, W. A. H. A. Abeygunasekara, S. M. Hewavidana, T. Jananthan, R. M. S. Madawala, K. Vaheeshan
Abstract:
Forecasting the potential for disastrous events such as landslides has become one of the major necessities in the current world. Most of the landslides that occur in Sri Lanka are found to be triggered by intense rainfall events. The study area is the landslide near Gerandiella waterfall, located by the 41st kilometer post on the Nuwara Eliya-Gampola main road in Kotmale Division in Sri Lanka. The landslide endangers the entire Kotmale town beneath the slope. The Geographic Information System (GIS) platform is very useful when it comes to emulating real-world processes. Such models are used in a wide array of applications, ranging from simple evaluations to forecasting future events. This project investigates the possibility of developing a dynamic model to map the spatial distribution of slope stability. The model incorporates several theoretical models, including the infinite slope model, the Green-Ampt infiltration model and a perched groundwater flow model. A series of rainfall values can be fed to the model as the main input to simulate the dynamics of slope stability. A hydrological model developed using GIS is used to quantify the perched water table height, which is one of the most critical parameters affecting slope stability. The infinite slope stability model is used to quantify the degree of slope stability in terms of the factor of safety. The DEM was built with the use of digitized contour data. Stratigraphy was modeled in Surfer using borehole data and resistivity images. Data available from rainfall gauges and piezometers were used in calibrating the model. During the calibration, the parameters were adjusted until a good fit between the simulated ground water levels and the piezometer readings was obtained. This model, equipped with predicted rainfall values, can be used to forecast the slope dynamics of the area of interest.
Therefore, the slope stability of rainfall-induced landslides can be investigated by adjusting the temporal dimensions.
Keywords: factor of safety, geographic information system, hydrological model, slope stability
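The infinite slope model referenced above reduces to a closed-form factor of safety. A minimal sketch, with illustrative soil parameters rather than site data from the Kotmale study, shows how a rising perched water table (saturated fraction m of the failure depth) lowers the factor of safety:

```python
import math

# Minimal sketch of the infinite slope stability model with a perched
# water table. Parameter values are illustrative, not site data.

def factor_of_safety(c, phi_deg, gamma, z, beta_deg, m, gamma_w=9.81):
    """c: effective cohesion (kPa); phi_deg: friction angle (deg);
    gamma: soil unit weight (kN/m^3); z: failure-plane depth (m);
    beta_deg: slope angle (deg); m: saturated fraction of z (0..1)."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    # effective normal stress on the failure plane
    normal = (gamma * z - gamma_w * m * z) * math.cos(beta) ** 2
    resisting = c + normal * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

fs_dry = factor_of_safety(c=10, phi_deg=30, gamma=18, z=2.0, beta_deg=35, m=0.0)
fs_wet = factor_of_safety(c=10, phi_deg=30, gamma=18, z=2.0, beta_deg=35, m=1.0)
# Rising perched water table (m -> 1) lowers the factor of safety below 1,
# which is the instability threshold the forecast maps would flag.
```

Fed with a rainfall-driven time series of m (from the hydrological model), the same function yields the temporal slope-stability dynamics the abstract describes.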
Procedia PDF Downloads 423
1727 3D-Shape-Perception Studied Exemplarily with Tetrahedron and Icosahedron as Prototypes of the Polarities Sharp versus Round
Authors: Iris Sauerbrei, Jörg Trojan, Erich Lehner
Abstract:
Introduction and significance of the study: This study examines whether three-dimensional shapes elicit distinct patterns of perception. If so, this is relevant for all fields of design, especially the design of the built environment. Description of basic methodologies: The five Platonic solids are the geometrical base for all other three-dimensional shapes, and among them the tetrahedron and icosahedron provide the clearest representation of the qualities sharp and round. The pair of attributes ‘sharp versus round’ has already been examined in various surveys in the psychology of perception and in neuroscience by means of graphics, images of products of daily use, as well as photographs and walk-through videos of landscapes and architecture. To verify whether the outcomes of the existing surveys transfer to the perception of three-dimensional shapes, walk-in models (total height 2.2 m) of a tetrahedron and an icosahedron were set up in a public park in Frankfurt am Main, Germany. Preferences of park visitors were tested by questionnaire; they were also asked to write down associations in free text. In summer 2015, the tetrahedron was assembled eight times, the icosahedron seven times. In total, 288 participants took part in the study; 116 rated the tetrahedron, 172 rated the icosahedron. Findings: Preliminary analyses of the collected data using Wilcoxon rank-sum tests show that the perceptions of the two solids differ with respect to several attributes and that each of the tested models shows significance for specific attributes. Conclusion: These findings confirm the assumptions and provide first evidence that the perception of three-dimensional shapes is associated with characteristic attributes, and indicate which ones. In order to enable conscious choices for spatial arrangements in design processes for the built environment, future studies should examine attributes for the other three basic solids: octahedron, cube, and dodecahedron.
Additionally, similarities and differences between the perceptions of two- and three-dimensional shapes, as well as shapes that are more complex, need further research.
Keywords: 3D shapes, architecture, geometrical features, space perception, walk-in models
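The Wilcoxon rank-sum test used in the preliminary analysis above can be sketched with the normal approximation; the questionnaire ratings below are invented, since the study's raw data are not shown:

```python
import math
from statistics import NormalDist

def rank_sum_test(a, b):
    """Two-sided Wilcoxon rank-sum test via the normal approximation
    (average ranks for ties; no tie correction -- illustration only)."""
    combined = sorted(a + b)
    rank_of = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        rank_of[combined[i]] = (i + 1 + j) / 2   # average of ranks i+1 .. j
        i = j
    w = sum(rank_of[v] for v in a)               # rank sum of the first sample
    n1, n2 = len(a), len(b)
    mean = n1 * (n1 + n2 + 1) / 2
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mean) / sd
    return z, 2 * (1 - NormalDist().cdf(abs(z)))

# Invented 7-point ratings of the attribute "sharp" for the two solids:
tetra = [6, 7, 6, 5, 6]
icosa = [2, 3, 2, 4, 3]
z, p = rank_sum_test(tetra, icosa)   # small p: the solids are rated differently
```

In practice a statistics package (e.g. `scipy.stats.ranksums`) with exact small-sample handling would be preferred; the sketch only makes the test's mechanics concrete.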
Procedia PDF Downloads 228
1726 Field Emission Scanning Microscope Image Analysis for Porosity Characterization of Autoclaved Aerated Concrete
Authors: Venuka Kuruwita Arachchige Don, Mohamed Shaheen, Chris Goodier
Abstract:
Autoclaved aerated concrete (AAC) is known for its light weight, easy handling, high thermal insulation, and extremely porous structure. Investigation of pore behavior in AAC is crucial for characterizing the material, standardizing design and production techniques, enhancing mechanical, durability, and thermal performance, studying the effectiveness of protective measures, and analyzing the effects of weather conditions. The significant details of pores are difficult to observe with acceptable accuracy. High-resolution Field Emission Scanning Electron Microscope (FESEM) image analysis is a promising technique for investigating the pore behavior and density of AAC, and it is adopted in this study. A mercury intrusion porosimeter and a gas pycnometer were employed to characterize porosity distribution and density parameters. The analysis considered three different densities of AAC blocks and three layers in the altitude direction within each block. A set of procedures was developed to extract and analyze the details of pore shape, pore size, pore connectivity, and pore percentages from FESEM images of AAC. Average pore behavior outcomes per unit area were presented. Comparison of porosity distribution and density parameters revealed significant variations. FESEM imaging offered unparalleled insights into porosity behavior, surpassing the capabilities of the other techniques. The multi-staged analysis provides the porosity percentage occupied by various pore categories, total porosity, the variation of pore distribution across AAC densities and layers, the number of two-dimensional and three-dimensional pores, the variation of apparent and matrix densities with respect to pore behavior, the variation of pore behavior with respect to aluminum content, and the relationship among shape, diameter, connectivity, and percentage in each pore classification.
Keywords: autoclaved aerated concrete, density, imaging technique, microstructure, porosity behavior
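One elementary step of such image analysis, estimating a 2D porosity percentage by grey-level thresholding, can be sketched as follows. The tiny "image" is synthetic (pores assumed dark); a real FESEM frame would be loaded with an imaging library and the threshold chosen by a method such as Otsu's:

```python
# Hedged sketch: 2D porosity percentage from a thresholded grey-level image.
# The 4x4 pixel array below is synthetic, purely for illustration.

def porosity_percent(image, threshold):
    """Percentage of pixels darker than `threshold` (pores imaged dark)."""
    pixels = [p for row in image for p in row]
    pores = sum(1 for p in pixels if p < threshold)
    return 100.0 * pores / len(pixels)

image = [
    [200, 210,  40,  35],
    [198,  30,  25, 205],
    [ 45, 202, 208, 199],
    [201, 203,  38, 204],
]
p = porosity_percent(image, threshold=100)   # 37.5 for this synthetic frame
```

Classifying connected dark regions by shape, size, and connectivity, as the abstract describes, would build on exactly this kind of binarized image.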
Procedia PDF Downloads 69
1725 Reliability and Cost Focused Optimization Approach for a Communication Satellite Payload Redundancy Allocation Problem
Authors: Mehmet Nefes, Selman Demirel, Hasan H. Ertok, Cenk Sen
Abstract:
A typical reliability engineering problem regarding communication satellites has been considered: determining the redundancy allocation scheme of power amplifiers within the payload transponder module, whose dominant function is to amplify the power levels of the signals received from the Earth, by maximizing reliability against mass, power, and other technical limitations. Adding each redundant power amplifier component increases not only reliability but also the hardware, testing, and launch cost of a satellite. This study investigates a multi-objective approach used to solve the Redundancy Allocation Problem (RAP) for a communication satellite payload transponder, focusing on design cost due to redundancy and on reliability factors. The main purpose is to find the optimum power amplifier redundancy configuration satisfying reliability and capacity thresholds simultaneously, instead of analyzing them separately or independently. A mathematical model and calculation approach are established, including objective function definitions, and the problem is then solved analytically with different input parameters in the MATLAB environment. Example results showed that payload capacity and the failure rate of power amplifiers have remarkable effects on the solution and also on processing time.
Keywords: communication satellite payload, multi-objective optimization, redundancy allocation problem, reliability, transponder
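The core trade-off above, each added amplifier raises reliability but also cost, can be sketched with the standard k-out-of-n active-redundancy formula. The component reliability, unit cost, and threshold below are invented numbers, not values from the study:

```python
from math import comb

# Hedged sketch of the reliability/cost trade-off for power-amplifier
# redundancy. All numeric inputs are invented for illustration.

def k_out_of_n(p, k, n):
    """Reliability of an active-redundant group needing k of n amplifiers,
    each working independently with probability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def cheapest_config(p, k, unit_cost, r_min, n_max=12):
    """Smallest n (hence lowest cost) meeting the reliability threshold."""
    for n in range(k, n_max + 1):
        if k_out_of_n(p, k, n) >= r_min:
            return n, n * unit_cost
    return None

# Need 4 working amplifiers (capacity threshold); each is 92% reliable.
n, cost = cheapest_config(p=0.92, k=4, unit_cost=1.0, r_min=0.99)
```

With these inputs the search settles on n = 6 amplifiers; the actual study solves a richer multi-objective version with mass and power limits, but the monotone reliability-versus-cost tension is the same.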
Procedia PDF Downloads 261
1724 The Influence of Positive and Negative Affect on Perception and Judgement
Authors: Annamarija Paula
Abstract:
Modern psychology is divided into three distinct domains: cognition, affect, and conation. Historically, psychology devalued the importance of studying affect in order to explain human behavior, as it supposedly lacked both rational thought and a scientific foundation. As a result, affect remained the least studied domain for years to come. However, the last 30 years have marked a significant change in perspective, with claims that not only is affect highly adaptive, but it also plays a crucial role in cognitive processes. Affective states have a crucial impact on human behavior, which has led to fundamental advances in the study of the influence of affective states on perception and judgment. Positive affect and negative affect are distinct entities and have different effects on social information processing. In addition, emotions of the same valence are manifested in distinct and unique physiological reactions, indicating that not all forms of positive or negative affect are the same or serve the same purpose. Affect plays a vital role in perception and judgment, which impacts the validity and reliability of memory retrieval. This paper analyzes key findings from the past three decades of observational and empirical research on affective states and cognition. It also addresses the limitations connected to the findings and proposes suggestions for possible future research.
Keywords: memory, affect, perception, judgement, mood congruency effect
Procedia PDF Downloads 130
1723 Ill-Posed Inverse Problems in Molecular Imaging
Authors: Ranadhir Roy
Abstract:
Inverse problems arise in medical (molecular) imaging. These problems are characterized by their large scale in three dimensions and by the diffusion equation, which models the physical phenomena within the media. The inverse problems are posed as a nonlinear optimization in which the unknown parameters are found by minimizing the difference between the predicted data and the measured data. To obtain a unique and stable solution to an ill-posed inverse problem, a priori information must be used. Mathematical conditions to obtain stable solutions are established in Tikhonov’s regularization method, where the a priori information is introduced via a stabilizing functional, which may be designed to incorporate some relevant information about the inverse problem. Effective determination of the Tikhonov regularization parameter requires knowledge of the true solution, or, in the case of optical imaging, the true image. Yet, in clinically based imaging, the true image is not known. To alleviate these difficulties, we have applied the penalty/modified barrier function (PMBF) method instead of the Tikhonov regularization technique to make the inverse problems well-posed. Unlike the Tikhonov regularization method, the constrained optimization technique, which is based on simple bounds on the optical parameter properties of the tissue, can easily be implemented in the PMBF method. Imposing the constraints on the optical properties of the tissue explicitly restricts the solution sets and can restore uniqueness. Like the Tikhonov regularization method, the PMBF method limits the size of the condition number of the Hessian matrix of the given objective function. The accuracy and rapid convergence of the PMBF method require a good initial guess of the Lagrange multipliers. To obtain the initial guess of the multipliers, we use a least-squares unconstrained minimization problem.
Three-dimensional images of fluorescence absorption coefficients and lifetimes were reconstructed from contact and noncontact experimentally measured data.
Keywords: constrained minimization, ill-conditioned inverse problems, Tikhonov regularization method, penalty modified barrier function method
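The Tikhonov baseline discussed above can be made concrete on a tiny contrived system: solving the regularized normal equations (AᵀA + λI)x = Aᵀb stabilizes an ill-conditioned problem against data noise. The 2×2 matrix below is not an optical-imaging operator, just a minimal illustration of the conditioning effect:

```python
# Minimal numerical sketch of Tikhonov regularization on an ill-conditioned
# 2x2 system A x = b. The matrix is contrived purely to show stabilization.

def solve2(m, v):
    """Solve a 2x2 linear system m x = v by Cramer's rule."""
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [(v[0] * m[1][1] - v[1] * m[0][1]) / det,
            (m[0][0] * v[1] - m[1][0] * v[0]) / det]

def tikhonov(a, b, lam):
    """Solve the regularized normal equations (A^T A + lam I) x = A^T b."""
    ata = [[sum(a[k][i] * a[k][j] for k in range(2)) for j in range(2)]
           for i in range(2)]
    ata[0][0] += lam
    ata[1][1] += lam
    atb = [sum(a[k][i] * b[k] for k in range(2)) for i in range(2)]
    return solve2(ata, atb)

a = [[1.0, 1.0], [1.0, 1.0001]]   # nearly singular -> ill-conditioned
b_noisy = [2.0, 2.0002]           # noiseless data [2, 2.0001] gives x = [1, 1]

x_plain = tikhonov(a, b_noisy, lam=0.0)    # noise wildly amplified: ~[0, 2]
x_reg = tikhonov(a, b_noisy, lam=1e-4)     # stays near the true [1, 1]
```

The PMBF approach the paper advocates replaces the λ-functional with explicit bound constraints on the tissue's optical parameters, but the ill-conditioning it combats is the one visible in `x_plain` here.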
Procedia PDF Downloads 271
1722 Spectral Response Measurements and Materials Analysis of Ageing Solar Photovoltaic Modules
Authors: T. H. Huang, C. Y. Gao, C. H. Lin, J. L. Kwo, Y. K. Tseng
Abstract:
The design and reliability of solar photovoltaic modules are crucial to the development of solar energy, and efforts are still being made to extend the life of photovoltaic modules and improve their efficiency. Because natural aging is time-consuming and does not provide manufacturers and investors with timely information, accelerated aging is currently the best way to estimate the life of photovoltaic modules. In this study, accelerated aging with different light sources was combined with spectral response measurements to understand the effect of the light source on aging tests. There are two types of experimental samples, packaged and unpackaged, which were irradiated with full-spectrum and UVC light sources for accelerated aging, alongside an unaged control group. The full-spectrum aging was performed by irradiating the solar cell for two weeks with a xenon lamp, whose spectrum resembles the solar spectrum, while the UVC accelerated aging was performed by irradiating the solar cell with a UVC lamp for two weeks. The samples were first visually observed and infrared thermal images were taken; the electrical (IV) and Spectral Responsivity (SR) data were then obtained by measuring the spectral response of the samples, followed by Scanning Electron Microscopy (SEM), Raman spectroscopy (Raman), and X-ray Diffraction (XRD) analysis. The results of the electrical (IV), Spectral Responsivity (SR) and material analyses were used to compare the differences between packaged and unpackaged solar cells after full-spectrum aging and accelerated UVC aging, and unaged solar cells. The main objective of this study is to compare the difference in aging between packaged and unpackaged solar cells irradiated with different light sources.
We determined by infrared thermal imaging that both full-spectrum aging and UVC accelerated aging increase the defects of solar cells, and IV measurements demonstrated that the conversion efficiency of solar cells decreases after full-spectrum aging and UVC accelerated aging. SEM revealed scorch marks on both unpackaged UVC accelerated aging solar cells and unpackaged full-spectrum aging solar cells. Raman spectroscopy examines the Si intensity of the solar cells, and XRD confirms the crystallinity of the solar cells by the intensity of the Si and Ag peaks.
Keywords: solar cell, aging, spectral response measurement
Procedia PDF Downloads 103
1721 Quality Assurances for an On-Board Imaging System of a Linear Accelerator: Five Months Data Analysis
Authors: Liyun Chang, Cheng-Hsiang Tsai
Abstract:
To ensure that radiation is precisely delivered to the target in cancer patients, a linear accelerator equipped with a pretreatment on-board imaging system is introduced, through which the patient setup is verified before the daily treatment. New-generation radiotherapy using beam-intensity modulation, usually associated with steep dose gradients, is claimed to have achieved both a higher degree of dose conformation in the targets and a further reduction of toxicity in normal tissues. However, this benefit is counterproductive if the beam is delivered imprecisely. To avoid shooting critical organs or normal tissues rather than the target, it is very important to carry out quality assurance (QA) of this on-board imaging system. The QA of the On-Board Imager® (OBI) system of one Varian Clinac-iX linear accelerator was performed through our procedures, modified from a relevant report and AAPM TG-142. Two image modalities of the OBI system were examined: 2D radiography and 3D cone-beam computed tomography (CBCT). The daily and monthly QA was executed for five months in the categories of safety, geometrical accuracy and image quality. A marker phantom and a blade calibration plate were used for the QA of geometrical accuracy, while the Leeds phantom and the Catphan 504 phantom were used in the QA of radiographic and CBCT image quality, respectively. The reference images were generated through a GE LightSpeed CT simulator with an ADAC Pinnacle treatment planning system. Finally, the image quality was analyzed via an OsiriX medical imaging system. For the geometrical accuracy test, the average deviations of the OBI isocenter in each direction are less than 0.6 mm with uncertainties less than 0.2 mm, while all the other items have displacements less than 1 mm. For radiographic image quality, the spatial resolution is 1.6 lp/cm with contrasts less than 2.2%.
The spatial resolution, low contrast, and HU homogeneity of CBCT are better than 6 lp/cm, less than 1%, and within 20 HU, respectively. All tests are within the criteria, except that the HU value of Teflon measured with the full-fan mode exceeds the suggested value, which could be due to its inherently high HU value and needs to be rechecked. The OBI system in our facility was thus demonstrated to be reliable, with stable image quality. The QA of the OBI system is really necessary to achieve the best treatment for a patient.
Keywords: CBCT, image quality, quality assurance, OBI
Procedia PDF Downloads 298
1720 Using T-Splines to Model Point Clouds from Terrestrial Laser Scanner
Authors: G. Kermarrec, J. Hartmann
Abstract:
Spline surfaces are a major representation of freeform surfaces in the computer-aided graphics industry and were recently introduced in the field of geodesy for processing point clouds from terrestrial laser scanners (TLS). Surface fitting consists of approximating a trustworthy mathematical surface to a large 3D point cloud. Standard B-spline surfaces lack local refinement due to their tensor-product construction. The consequences are oscillating geometry, particularly in the transition from low- to high-curvature parts, for scattered point clouds with missing data. A more economical alternative, in terms of the number of parameters needed to handle point clouds with a huge number of observations, is the recently introduced T-splines. As long as the partition of unity is guaranteed, their computational complexity is low and they are flexible. T-splines are implemented in a commercial package called Rhino, a 3D modeler widely used in computer-aided design to create and animate NURBS objects. We have applied T-spline surface fitting to terrestrial laser scanner point clouds from a bridge under load and a sheet pile wall with noisy observations. We highlight their potential for modelling details with high trustworthiness, paving the way for further applications in deformation analysis.
Keywords: deformation analysis, surface modelling, terrestrial laser scanner, T-splines
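Both B- and T-splines build on the same univariate basis functions, evaluated by the Cox-de Boor recursion, and the partition of unity mentioned above means these basis functions sum to one at every parameter value. A small sketch with an arbitrary clamped knot vector (not data from the bridge or sheet-pile-wall scans):

```python
# Sketch of the Cox-de Boor recursion behind B-spline (and T-spline)
# surfaces. Knot vector and degree are arbitrary illustrative choices.

def bspline_basis(i, p, u, knots):
    """Value of the i-th degree-p B-spline basis function at parameter u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:
        left = ((u - knots[i]) / (knots[i + p] - knots[i])
                * bspline_basis(i, p - 1, u, knots))
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i + 1, p - 1, u, knots))
    return left + right

degree = 2
knots = [0, 0, 0, 1, 2, 3, 3, 3]         # open (clamped) knot vector
n_basis = len(knots) - degree - 1        # 5 basis functions

total = sum(bspline_basis(i, degree, 0.7, knots) for i in range(n_basis))
# total == 1.0: the basis functions form a partition of unity
```

T-splines relax the global tensor-product grid (allowing T-junctions in the control mesh) but keep exactly this basis locally, which is why preserving the partition of unity is the stated precondition.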
Procedia PDF Downloads 140
1719 Same-Day Detection Method of Salmonella Spp., Shigella Spp. and Listeria Monocytogenes with Fluorescence-Based Triplex Real-Time PCR
Authors: Ergun Sakalar, Kubra Bilgic
Abstract:
Faster detection and characterization of pathogens are the basis for avoiding foodborne illness. Salmonella spp., Shigella spp. and Listeria monocytogenes are among the most life-threatening common foodborne bacteria. Rapid and accurate detection of these pathogens is important to prevent food poisoning and outbreaks and to manage food chains. The present work promises to develop a sensitive, species-specific and reliable PCR-based detection system for simultaneous detection of Salmonella spp., Shigella spp. and Listeria monocytogenes. For this purpose, three genes were picked out: ompC for Salmonella spp., ipaH for Shigella spp. and hlyA for L. monocytogenes. After a short pre-enrichment, the milk was passed through a vacuum filter and bacterial DNA was extracted using the commercially available kit GIDAGEN® (Turkey, İstanbul). Detection of amplicons was verified by examination of the melting temperatures (Tm), which are 72°C, 78°C and 82°C for Salmonella spp., Shigella spp. and L. monocytogenes, respectively. The method's specificity was checked against a group of bacterial strains, and a sensitivity test was also carried out, resulting in detection limits under 10² CFU mL⁻¹ of milk for each bacterial strain. Our results show that the fluorescence-based triplex qPCR method can be used routinely to detect Salmonella spp., Shigella spp. and L. monocytogenes during milk processing procedures in order to reduce cost, time of analysis and the risk of foodborne disease outbreaks.
Keywords: evagreen, food-borne bacteria, pathogen detection, real-time pcr
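The melting-curve readout step of such a triplex assay, assigning an amplicon to a pathogen by its melting temperature, can be sketched as a simple lookup. The Tm targets (72, 78, 82°C) come from the abstract; the ±1°C tolerance is an assumed value for illustration only:

```python
# Toy sketch of melting-curve classification in a triplex qPCR assay.
# Tm targets are from the abstract; the tolerance window is assumed.

TM_TARGETS = {
    72.0: "Salmonella spp.",
    78.0: "Shigella spp.",
    82.0: "Listeria monocytogenes",
}

def identify(tm, tolerance=1.0):
    """Map a measured melting temperature to a pathogen, or None."""
    for target, species in TM_TARGETS.items():
        if abs(tm - target) <= tolerance:
            return species
    return None   # no specific amplicon detected

hit = identify(81.6)   # a measured Tm near the hlyA amplicon target
```

Real instrument software fits the melting curve's negative derivative to locate Tm peaks before any such assignment; the sketch covers only the final classification step.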
Procedia PDF Downloads 244
1718 Evaluation and Compression of Different Language Transformer Models for Semantic Textual Similarity Binary Task Using Minority Language Resources
Authors: Ma. Gracia Corazon Cayanan, Kai Yuen Cheong, Li Sha
Abstract:
Training a language model for a minority language has been a challenging task. The lack of available corpora with which to train and fine-tune state-of-the-art language models is still a challenge in the area of Natural Language Processing (NLP). Moreover, the need for high computational resources and bulk data limits the attainment of this task. In this paper, we present the following contributions: (1) we introduce and use a translation pair set of Tagalog and English (TL-EN) in pre-training a language model for a minority language resource; (2) we fine-tune and evaluate top-ranking pre-trained semantic textual similarity binary task (STSB) models on both TL-EN and STS dataset pairs; and (3) we reduce the size of the model to offset the need for high computational resources. Based on our results, the models that were pre-trained on translation pairs and STS pairs can perform well on the STSB task. Also, reducing the model to a smaller dimension has no negative effect on performance but rather yields a notable increase in the similarity scores. Moreover, models that were pre-trained on a similar dataset have a tremendous effect on the model’s performance scores.
Keywords: semantic matching, semantic textual similarity binary task, low resource minority language, fine-tuning, dimension reduction, transformer models
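The scoring step behind a semantic-textual-similarity binary decision is typically cosine similarity between sentence embeddings, thresholded into similar/not-similar. The 4-dimensional vectors and the 0.5 threshold below are invented for illustration; real transformer embeddings have hundreds of dimensions:

```python
import math

# Sketch of binary STS scoring: cosine similarity of sentence embeddings.
# The tiny vectors and the threshold are hypothetical illustration values.

def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = (math.sqrt(sum(a * a for a in u))
            * math.sqrt(sum(b * b for b in v)))
    return dot / norm

def is_similar(u, v, threshold=0.5):
    """Binary STS decision, as in the STSB task described above."""
    return cosine_similarity(u, v) >= threshold

emb_en = [0.2, 0.7, 0.1, 0.4]     # hypothetical English sentence embedding
emb_tl = [0.25, 0.6, 0.05, 0.5]   # hypothetical Tagalog translation embedding
score = cosine_similarity(emb_en, emb_tl)   # close to 1 for a good pair
```

Dimension reduction, as in contribution (3), shrinks the embedding vectors fed into this same comparison, which is why the similarity score rather than the architecture is the natural evaluation target.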
Procedia PDF Downloads 211
1717 Application of Hydrologic Engineering Centers and River Analysis System Model for Hydrodynamic Analysis of Arial Khan River
Authors: Najeeb Hassan, Mahmudur Rahman
Abstract:
The Arial Khan River is one of the main south-eastward outlets of the River Padma. This river maintains a meandering channel through its course and is erosional in nature. The specific objective of the research is to study and evaluate the hydrological characteristics of the river by assessing changes in cross-sections, discharge, water level and velocity profile at different stations, and to create a hydrodynamic model of the Arial Khan River. The necessary data have been collected from the Bangladesh Water Development Board (BWDB) and the Center for Environment and Geographic Information Services (CEGIS). Satellite images have been observed from Google Earth. In this study, a hydrodynamic model of the Arial Khan River has been developed using the well-known steady open-channel flow code Hydrologic Engineering Centers and River Analysis System (HEC-RAS) with field-surveyed geometric data. Cross-section properties at 22 locations on the Arial Khan for the years 2011, 2013 and 2015 were also analysed. A 1-D HEC-RAS model has been developed using the cross-sectional data of 2015, and appropriate boundary conditions are used to run the model. This Arial Khan River model is calibrated using the peak discharge of 2015. The applicable value of Manning's roughness coefficient (n) is adjusted through the process of calibration. The value of water level which ties with the observed data to an acceptable accuracy is taken as the calibrated model. The 1-D HEC-RAS model was then validated using the peak discharges from 2009-2018. The variation between the observed water levels and the water levels in the model is compared to validate the model. It is observed that due to seasonal variation, the discharge of the river changes rapidly, and Manning's roughness coefficient (n) also changes due to vegetation growth along the river banks. This river model may act as a tool to measure flood extent in the future.
Considering past peak flow discharges, it is strongly recommended to improve the carrying capacity of the Arial Khan River to protect the surrounding areas from flash floods.
Keywords: BWDB, CEGIS, HEC-RAS
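The roughness calibration described above rests on the steady uniform-flow relation, Manning's equation, Q = (1/n)·A·R^(2/3)·S^(1/2) in SI units. A minimal sketch for a rectangular section follows; the channel dimensions, slope, and n values are illustrative, not Arial Khan survey data:

```python
# Sketch of Manning's equation for a rectangular channel section.
# All numeric inputs are illustrative, not Arial Khan survey data.

def manning_discharge(n, width, depth, slope):
    """Q = (1/n) * A * R^(2/3) * S^(1/2), SI units."""
    area = width * depth                      # flow area A (m^2)
    wetted_perimeter = width + 2 * depth      # P (m)
    hydraulic_radius = area / wetted_perimeter
    return (1.0 / n) * area * hydraulic_radius ** (2 / 3) * slope ** 0.5

q = manning_discharge(n=0.030, width=120.0, depth=5.0, slope=0.0002)
# Raising n (e.g., denser bank vegetation) lowers the computed discharge,
# which is why n must be recalibrated as vegetation grows.
```

In HEC-RAS the calibration loop adjusts n until modeled water levels match gauge observations for the peak discharge, exactly the sensitivity this one-liner exposes.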
Procedia PDF Downloads 184
1716 PsyVBot: Chatbot for Accurate Depression Diagnosis using Long Short-Term Memory and NLP
Authors: Thaveesha Dheerasekera, Dileeka Sandamali Alwis
Abstract:
The escalating prevalence of mental health issues, such as depression and suicidal ideation, is a matter of significant global concern. It is plausible that a variety of factors, such as life events, social isolation, and preexisting physiological or psychological health conditions, could instigate or exacerbate these conditions. Traditional approaches to diagnosing depression entail a considerable amount of time and necessitate the involvement of adept practitioners. This underscores the necessity for automated systems capable of promptly detecting and diagnosing symptoms of depression. The PsyVBot system employs sophisticated natural language processing and machine learning methodologies, including the use of the NLTK toolkit for dataset preprocessing and the utilization of a Long Short-Term Memory (LSTM) model. The PsyVBot exhibits a remarkable ability to diagnose depression with a 94% accuracy rate through the analysis of user input. Consequently, this resource proves to be efficacious for individuals, particularly those enrolled in academic institutions, who may encounter challenges pertaining to their psychological well-being. The PsyVBot employs a Long Short-Term Memory (LSTM) model that comprises a total of three layers, namely an embedding layer, an LSTM layer, and a dense layer. The stratification of these layers facilitates a precise examination of linguistic patterns that are associated with the condition of depression. The PsyVBot has the capability to accurately assess an individual's level of depression through the identification of linguistic and contextual cues. The task is achieved via a rigorous training regimen, which is executed by utilizing a dataset comprising information sourced from the subreddit r/SuicideWatch. The diverse data present in the dataset ensures precise and delicate identification of symptoms linked with depression, thereby guaranteeing accuracy. 
PsyVBot not only possesses diagnostic capabilities but also enhances the user experience through the utilization of audio outputs. This feature enables users to engage in more captivating and interactive sessions. The PsyVBot platform offers individuals the opportunity to conveniently diagnose mental health challenges through a confidential and user-friendly interface. Regarding the advancement of PsyVBot, maintaining user confidentiality and upholding ethical principles are of paramount significance. Diligent efforts are undertaken to adhere to ethical standards, thereby safeguarding the confidentiality of user information and ensuring its security. Moreover, the chatbot fosters a conducive atmosphere that is supportive and compassionate, thereby promoting psychological welfare. In brief, PsyVBot is an automated conversational agent that utilizes an LSTM model to assess the level of depression in accordance with the input provided by the user. The demonstrated accuracy rate of 94% serves as a promising indication of the potential efficacy of employing natural language processing and machine learning techniques in tackling challenges associated with mental health. The reliability of PsyVBot is further improved by the fact that it makes use of the Reddit dataset and incorporates the Natural Language Toolkit (NLTK) for preprocessing. PsyVBot represents a pioneering and user-centric solution that furnishes an easily accessible and confidential medium for seeking assistance. The present platform is offered as a modality to tackle the pervasive issue of depression and the contemplation of suicide.
Keywords: chatbot, depression diagnosis, LSTM model, natural language processing
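The cell arithmetic behind the embedding → LSTM → dense stack described above can be sketched for a 1-dimensional toy cell; the hand-picked scalar weights below are not trained values from PsyVBot:

```python
import math

# Minimal sketch of one LSTM cell's forward pass, scalar state for clarity.
# All weights are hand-picked toy values, not PsyVBot's trained parameters.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One LSTM time step for scalar input/state; weights in dict w."""
    f = sigmoid(w["wf"] * x + w["uf"] * h_prev + w["bf"])    # forget gate
    i = sigmoid(w["wi"] * x + w["ui"] * h_prev + w["bi"])    # input gate
    g = math.tanh(w["wg"] * x + w["ug"] * h_prev + w["bg"])  # candidate
    o = sigmoid(w["wo"] * x + w["uo"] * h_prev + w["bo"])    # output gate
    c = f * c_prev + i * g                                    # new cell state
    h = o * math.tanh(c)                                      # new hidden state
    return h, c

keys = ("wf", "uf", "bf", "wi", "ui", "bi",
        "wg", "ug", "bg", "wo", "uo", "bo")
w = {k: 0.5 for k in keys}

h = c = 0.0
for x in [0.2, -0.1, 0.4]:           # a toy embedded token sequence
    h, c = lstm_step(x, h, c, w)

# A dense sigmoid layer on h would then yield the depression probability;
# the 2.0 weight here is likewise hypothetical.
prob = sigmoid(2.0 * h)
```

In the actual system this recurrence runs vector-valued inside a framework layer (e.g. Keras `LSTM`) over NLTK-preprocessed token embeddings, but the gating structure is the same.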
Procedia PDF Downloads 69