Search results for: thermal processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7007

4697 Music Reading Expertise Facilitates Implicit Statistical Learning of Sentence Structures in a Novel Language: Evidence from Eye Movement Behavior

Authors: Sara T. K. Li, Belinda H. J. Chung, Jeffery C. N. Yip, Janet H. Hsiao

Abstract:

Music notation and text reading both involve statistical learning of musical or linguistic structures. However, it remains unclear how music reading expertise influences text reading behavior. The present study examined this issue through an eye-tracking study. Chinese-English bilingual musicians and non-musicians read English sentences, Chinese sentences, musical phrases, and sentences in Tibetan, a language novel to the participants, with their eye movements recorded. Each set of stimuli consisted of two conditions in terms of structural regularity: syntactically correct and syntactically incorrect musical phrases/sentences. Participants then completed a sentence comprehension task (for syntactically correct sentences) or a musical segment/word recognition task to test their comprehension/recognition abilities. The results showed that in reading musical phrases, as compared with non-musicians, musicians had higher accuracy in the recognition task and had shorter reading times, fewer fixations, and shorter fixation durations when reading syntactically correct (i.e., in a diatonic key) than incorrect (i.e., in a non-diatonic key/atonal) musical phrases. This result reflects their expertise in music reading. Interestingly, in reading Tibetan sentences, which were novel to both participant groups, while non-musicians did not show any behavioral differences between reading syntactically correct and incorrect Tibetan sentences, musicians showed shorter reading times and marginally fewer fixations when reading syntactically correct sentences than syntactically incorrect ones. However, none of the musicians reported discovering any structural regularities in the Tibetan stimuli when asked explicitly after the experiment, suggesting that they may have implicitly acquired the structural regularities in Tibetan sentences. This group difference was not observed when they read English or Chinese sentences. This result suggests that music reading expertise facilitates reading texts in a novel language (i.e., Tibetan), but not in languages that the readers are already familiar with (i.e., English and Chinese). This phenomenon may be due to the similarities between reading music notation and reading texts in a novel language, as in both cases the stimuli follow particular statistical structures but do not involve semantic or lexical processing. Thus, musicians may transfer statistical learning skills stemming from their music notation reading experience to implicitly discover the structures of sentences in a novel language. This speculation is consistent with a recent finding showing that music reading expertise modulates the processing of English nonwords (i.e., words that do not follow morphological or orthographic rules) but not pseudo- or real words. These results suggest that the modulation of music reading expertise on language processing depends on the similarities in the cognitive processes involved. It also has important implications for the benefits of music education on language and cognitive development.

Keywords: eye movement behavior, eye-tracking, music reading expertise, sentence reading, structural regularity, visual processing

Procedia PDF Downloads 380
4696 Traditional Ceramics Value in the Middle East

Authors: Abdelmessih Malak Sadek Labib

Abstract:

Ceramics remain the materials of choice for many applications, despite the advent of newer materials such as plastics and composites, thanks to their stability in harsh environments and their excellent electrical, mechanical, and thermal properties. However, ceramic materials also have disadvantages, chief among them brittleness. This fragility is commonly attributed to the strong covalent and ionic bonds in the ceramic body. There is still much to learn about brittle crack growth in ceramics, and the fragility of ceramics and their catastrophic failure remain frequently studied topics, particularly in load-bearing applications. One of the most commonly used ceramics for load-bearing applications such as veneers is porcelain, a type of traditional pottery. Traditional pottery consists mainly of three basic ingredients: clay, which gives plasticity; silica, which maintains the shape and stability of the ceramic body at high temperature; and feldspar, which affects glazing. In traditional pottery, the inversion of quartz during cooling can create microcracks that act as stress concentration centers. Consequently, subcritical crack growth originating from these quartz-inversion sites can cause unpredictable catastrophic failure of ceramic bodies under load. In the case of porcelain, however, the mullite hypothesis states that the strength of porcelain can be significantly increased by the felt-like interlocking of mullite needles in the ceramic body. In this way, a realistic assessment of the role of quartz and mullite in porcelain strength is needed to develop stronger and less brittle porcelain. Currently, the lack of reports on Young's moduli in the literature leads to erroneous conclusions regarding the mechanical behavior of porcelain. Therefore, the current project uses a Young's modulus approach to investigate the role of quartz and mullite in the mechanical strength of various porcelains, in addition to particle-size reduction, flexural strength testing, and fractographic techniques.

Keywords: materials, technical, ceramics, properties, thermal, stability, advantages

Procedia PDF Downloads 84
4695 Unstructured-Data Content Search Based on Optimized EEG Signal Processing and Multi-Objective Feature Extraction

Authors: Qais M. Yousef, Yasmeen A. Alshaer

Abstract:

Over the last few years, the amount of data available around the globe has increased rapidly. This has come with the emergence of recent concepts, such as big data and the Internet of Things, which have made data available all over the world. However, managing this massive amount of data remains a challenge due to the large variety of data types and their distribution. Therefore, locating the required file, particularly on the first attempt, has become a difficult task due to the large similarity of names among different files distributed on the web. Consequently, the accuracy and speed of search have been negatively affected. This work presents a method that uses electroencephalography signals to locate files based on their contents. Building on the concept of natural mind-wave processing, this work analyzes the mind-wave signals of different people, extracting their most appropriate features using a multi-objective metaheuristic algorithm, and then classifying them using an artificial neural network to distinguish among files with similar names. The aim of this work is to provide the ability to find files based on their contents using human thoughts only. Implementing this approach and testing it on real people proved its ability to find the desired files accurately within a noticeably shorter time and to retrieve them as the first choice for the user.
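The abstract does not give implementation detail, so the following minimal sketch only illustrates the general shape of such a pipeline under stated assumptions: band-power features are extracted from EEG segments and a small feed-forward network maps each segment to a file label. The band choices, channel count, and labels are placeholders, not the authors' actual feature set or classifier.

```python
# Hypothetical sketch: band-power features from EEG segments feed a small
# neural-network classifier that maps a "thought" to one of several files.
# Bands, channel layout, and labels are illustrative assumptions only.
import numpy as np
from scipy.signal import welch
from sklearn.neural_network import MLPClassifier

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(eeg, fs=256):
    """eeg: (n_channels, n_samples) array -> flat band-power feature vector."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))   # mean power per channel and band
    return np.concatenate(feats)

# Toy data: 60 labelled EEG segments, 8 channels, 2 s at 256 Hz, 3 target files
rng = np.random.default_rng(0)
X = np.array([band_power_features(rng.standard_normal((8, 512))) for _ in range(60)])
y = rng.integers(0, 3, size=60)                   # index of the file the subject thought of

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X, y)
print("predicted file id:", clf.predict(X[:1]))
```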

Keywords: artificial intelligence, data contents search, human active memory, mind wave, multi-objective optimization

Procedia PDF Downloads 175
4694 Automatic Early Breast Cancer Segmentation Enhancement by Image Analysis and Hough Transform

Authors: David Jurado, Carlos Ávila

Abstract:

Detection of early signs of breast cancer development is crucial to quickly diagnose the disease and to define an adequate treatment to increase the survival probability of the patient. Computer-Aided Detection systems (CADs), along with modern data techniques such as Machine Learning (ML) and Neural Networks (NN), have shown an overall improvement in digital mammography cancer diagnosis, reducing the false positive and false negative rates and becoming important tools for the diagnostic evaluations performed by specialized radiologists. However, ML- and NN-based algorithms rely on datasets that might bring issues to the segmentation tasks. In the present work, an automatic segmentation and detection algorithm is described. This algorithm uses image processing techniques along with the Hough transform to automatically identify microcalcifications, which are highly correlated with breast cancer development in the early stages. Along with image processing, automatic segmentation of high-contrast objects is done using edge extraction and the circle Hough transform. This provides the geometrical features needed for an automatic mask design, which extracts statistical features of the regions of interest. The results shown in this study demonstrate the potential of this tool for further diagnostics and classification of mammographic images due to its low sensitivity to noisy images and low-contrast mammograms.
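As a concrete illustration of the edge-extraction and circle Hough transform steps described above, the sketch below uses OpenCV on a synthetic image; the bright spots, blur kernel, and Hough parameters are illustrative assumptions, not the authors' settings.

```python
# Hypothetical sketch: Canny edge extraction plus a circle Hough transform
# flag small, high-contrast circular candidates (stand-ins for
# microcalcifications), which are then turned into an ROI mask for simple
# statistical features. The synthetic image and parameters are illustrative.
import cv2
import numpy as np

rng = np.random.default_rng(0)
img = rng.integers(20, 60, (256, 256)).astype(np.uint8)          # noisy background
for (x, y, r) in [(60, 80, 4), (150, 200, 3), (200, 90, 5)]:     # synthetic bright spots
    cv2.circle(img, (x, y), r, 230, -1)

blur = cv2.GaussianBlur(img, (5, 5), 0)                          # suppress noise
edges = cv2.Canny(blur, 50, 150)                                 # edge map (HoughCircles also runs Canny internally)

circles = cv2.HoughCircles(blur, cv2.HOUGH_GRADIENT, dp=1, minDist=10,
                           param1=150, param2=12, minRadius=1, maxRadius=8)

mask = np.zeros_like(img)
if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        cv2.circle(mask, (x, y), r, 255, -1)                     # automatic ROI mask

roi = img[mask > 0]
print("candidates:", 0 if circles is None else len(circles[0]),
      "| ROI mean:", None if roi.size == 0 else round(float(roi.mean()), 1))
```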

Keywords: breast cancer, segmentation, X-ray imaging, hough transform, image analysis

Procedia PDF Downloads 83
4693 Contribution of Remote Sensing and GIS to the Study of the Impact of the Salinity of Sebkhas on the Quality of Groundwater: Case of Sebkhet Halk El Menjel (Sousse)

Authors: Gannouni Sonia, Hammami Asma, Saidi Salwa, Rebai Noamen

Abstract:

Water resources in Tunisia have experienced quantitative and qualitative degradation, especially in wetlands and sebkhas. The objective of this work is to study the spatio-temporal evolution of salinity over 29 years (from 1987 to 2016). A study of the connection between surface water and groundwater is necessary to know the degree of influence of the sebkha brines on the water table. The evolution of surface salinity is determined by remote sensing based on Landsat TM and OLI/TIRS satellite images from the years 1987, 2007, 2010, and 2016. The processing of these images allowed us to determine the NDVI (Normalized Difference Vegetation Index), the salinity index, and the surface temperature around the sebkha. In addition, through a geographic information system (GIS), we could establish a map of the distribution of salinity in the subsurface of the water table of Chott Mariem and Hergla/Sidi Bou Ali/Kondar. The results of the image processing and the calculation of the indices and surface temperature show an increase in salinity downstream of the sebkha and the development of vegetation cover upstream and in the western part of the sebkha. This may be due both to contamination by seawater infiltration from the barrier beach of Hergla and to the passage of groundwater to the sebkha.
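To make the index computation concrete, the sketch below derives NDVI and a salinity index from reflectance bands held as arrays. NDVI follows the standard definition; the abstract does not state which salinity index was used, so a commonly used form is shown purely as an assumption, and the placeholder arrays stand in for the Landsat TM/OLI bands.

```python
# Sketch of the spectral indices mentioned above, computed from reflectance
# bands held as numpy arrays (in practice read from Landsat TM or OLI/TIRS
# scenes). NDVI is standard; SI = sqrt(green * red) is one common salinity
# index and is an assumption, since the abstract does not specify the index.
import numpy as np

rng = np.random.default_rng(1)
red, nir, green = (rng.uniform(0.02, 0.6, (100, 100)) for _ in range(3))  # placeholder bands

ndvi = (nir - red) / (nir + red + 1e-6)   # Normalized Difference Vegetation Index
si = np.sqrt(green * red)                 # illustrative spectral salinity index

print("NDVI range:", float(ndvi.min()), "to", float(ndvi.max()))
print("mean salinity index:", float(si.mean()))
```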

Keywords: spatio-temporal monitoring, salinity, satellite images, NDVI, sebkha

Procedia PDF Downloads 133
4692 Alphabet Recognition Using Pixel Probability Distribution

Authors: Vaidehi Murarka, Sneha Mehta, Dishant Upadhyay

Abstract:

Our project topic is “Alphabet Recognition Using Pixel Probability Distribution”. The project uses techniques of image processing and machine learning in computer vision. Alphabet recognition is the mechanical or electronic translation of scanned images of handwritten, typewritten, or printed text into machine-encoded text. It is widely used to convert books and documents into electronic files. Alphabet-recognition-based OCR applications are sometimes used in signature recognition, which is used in banks and other high-security buildings. One popular mobile application reads a visiting card and stores it directly to the contacts. OCRs are also used in radar systems for reading speeding drivers' license plates, among many other things. The implementation of our project has been done using Visual Studio and OpenCV (Open Source Computer Vision). Our algorithm is based on neural networks (machine learning). The project was implemented in three modules: (1) Training: This module aims at “database generation”. The database was generated using two methods: (a) Run-time generation, which builds the database at compilation time using the inbuilt fonts of the OpenCV library; human intervention is not necessary for generating this database. (b) Contour detection: a ‘jpeg’ template containing different fonts of an alphabet is converted to a weighted matrix using specialized functions (contour detection and blob detection) of OpenCV. The main advantage of this type of database generation is that the algorithm becomes self-learning and the final database requires little memory to be stored (119 kB precisely). (2) Preprocessing: The input image is pre-processed using image processing concepts such as adaptive thresholding, binarizing, and dilating, and is made ready for segmentation. “Segmentation” includes extraction of lines, words, and letters from the processed text image. (3) Testing and prediction: The extracted letters are classified and predicted using the neural network algorithm. The algorithm recognizes an alphabet based on certain mathematical parameters calculated using the database and the weight matrix of the segmented image.
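The sketch below illustrates, under stated assumptions, the pre-processing and segmentation module described in step (2): adaptive thresholding and dilation with OpenCV, followed by contour detection to crop individual letters for the classifier. The rendered test text, kernel size, and threshold parameters are placeholders.

```python
# Hypothetical sketch of module (2): adaptive thresholding, dilation, and
# contour-based extraction of letters that would then be fed to the neural
# network. A test image is rendered with an inbuilt OpenCV font so the
# example is self-contained; parameter values are illustrative.
import cv2
import numpy as np

img = np.full((100, 400), 255, np.uint8)
cv2.putText(img, "HELLO", (10, 70), cv2.FONT_HERSHEY_SIMPLEX, 2, 0, 3)

binary = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                               cv2.THRESH_BINARY_INV, 31, 10)    # binarize (letters -> white)
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
dilated = cv2.dilate(binary, kernel, iterations=1)               # thicken strokes

contours, _ = cv2.findContours(dilated, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
letters = []
for c in sorted(contours, key=lambda c: cv2.boundingRect(c)[0]): # left-to-right order
    x, y, w, h = cv2.boundingRect(c)
    if w * h > 50:                                               # skip specks
        letters.append(cv2.resize(binary[y:y + h, x:x + w], (20, 20)))  # classifier input

print("segmented", len(letters), "candidate letters")
```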

Keywords: contour-detection, neural networks, pre-processing, recognition coefficient, runtime-template generation, segmentation, weight matrix

Procedia PDF Downloads 389
4691 Fault Detection and Diagnosis of Broken Bar Problem in Induction Motors Based on Wavelet Analysis and EMD Method: Case Study of Mobarakeh Steel Company in Iran

Authors: M. Ahmadi, M. Kafil, H. Ebrahimi

Abstract:

Nowadays, induction motors have a significant role in industry. Condition monitoring (CM) of this equipment has gained remarkable importance during recent years due to huge production losses, substantial imposed costs, and increases in vulnerability, risk, and uncertainty levels. Motor current signature analysis (MCSA) is one of the most important techniques in CM. This method can be used for the detection of broken rotor bars. Signal processing methods such as the Fast Fourier Transform (FFT), wavelet transform, and Empirical Mode Decomposition (EMD) are used for analyzing MCSA output data. In this study, these signal processing methods are used for broken bar detection in the induction motors of the Mobarakeh Steel Company. Based on the wavelet transform method, an index for fault detection, CF, is introduced, which relates the maximum of the wavelet transform coefficients to their mean. We find that, in the broken bar condition, the CF factor is greater than in the healthy condition. Based on the EMD method, the energy of the intrinsic mode functions (IMFs) is calculated, and we find that when motor bars become broken, the energy of the IMFs increases.
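To make the two indicators concrete, the sketch below computes them on a toy current signal. Note that the abstract's definition of CF is interpreted here as a max-to-mean ratio over the wavelet detail coefficients, which is an assumption, and the sideband frequency and PyWavelets settings are placeholders.

```python
# Sketch of the two fault indicators under stated assumptions: CF is taken as
# the ratio of the maximum to the mean of the absolute wavelet detail
# coefficients of the stator current (the abstract's wording leaves the exact
# definition open), and the EMD indicator is the energy (sum of squares) of
# each intrinsic mode function. The toy signal and the 44 Hz sideband are
# placeholders; real IMFs would come from an EMD routine.
import numpy as np
import pywt

def cf_index(current, wavelet="db4", level=5):
    coeffs = pywt.wavedec(current, wavelet, level=level)
    detail = np.abs(np.concatenate(coeffs[1:]))      # detail coefficients only
    return detail.max() / detail.mean()

def imf_energies(imfs):
    return [float(np.sum(imf ** 2)) for imf in imfs]

t = np.arange(0, 1, 1 / 5000)                        # 1 s of current sampled at 5 kHz
healthy = np.sin(2 * np.pi * 50 * t)                 # idealized 50 Hz supply current
broken = healthy + 0.05 * np.sin(2 * np.pi * 44 * t) # added sideband mimicking a broken bar

print("CF healthy:", round(float(cf_index(healthy)), 2))
print("CF broken :", round(float(cf_index(broken)), 2))  # the study reports CF rising for broken bars
```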

Keywords: broken bar, condition monitoring, diagnostics, empirical mode decomposition, fourier transform, wavelet transform

Procedia PDF Downloads 150
4690 Modernization of Garri-Frying Technologies with Respect to Women's Anthropometric Qualities in Nigeria

Authors: Adegbite Bashiru Adeniyi, Olaniyi Akeem Olawale, Ayobamidele Sinatu Juliet

Abstract:

The study was carried out in the six South-Western states of Nigeria to analyze the socio-economic characteristics of garri processors and their anthropometric qualities with respect to the modern technologies used in garri processing. About 20 respondents were randomly selected from each of the six workstations purposively considered for the study because their daily processing activities had already attracted high patronage from customers. These include Oguntolu village (Ogun State), Igoba-Akure (Ondo State), Imo-Ilesa (Osun State), Odo Oba-Ileri (Oyo State), Irasa village (Ekiti State), and Epe in Lagos State. An interview schedule was conducted for 120 respondents to elicit information. Data were analyzed using descriptive statistical tools. It was observed from the findings that respondents were in their most productive age range (36-45 years), except in Ogun State where the majority (45%) were older than 45 years. A few processors were younger than 26 years old. The findings further revealed that not less than 55% had a body weight greater than 50.0 kilograms, and not less than 70% were taller than 1.5 meters. Similarly, the hand length and hand thickness of the majority were long and bulky, which are considered suitable for operating some modern and improved technologies in the garri-frying process. This information could be used by technology developers to enhance the production of modern equipment and tools for greater efficiency.

Keywords: agro-business, anthropometric, modernization, proficiency

Procedia PDF Downloads 512
4689 Instant Location Detection of Objects Moving at High Speed in C-OTDR Monitoring Systems

Authors: Andrey V. Timofeev

Abstract:

A practical and efficient approach is suggested for estimating the instantaneous bounds of high-speed objects in C-OTDR monitoring systems. In the case of super-dynamic objects (trains, cars), it is difficult to obtain an adequate estimate of the instantaneous object localization because of estimation lag. In other words, reliable estimation of the coordinates of a monitored object requires time for observation data to be collected by the C-OTDR system, and only once the required sample volume has been collected can the final decision be issued. This, however, is contrary to the requirements of many real applications. For example, in rail traffic management systems we need data on the localization of dynamic objects in real time. The way to solve this problem is to use a set of statistically independent parameters of C-OTDR signals to obtain the most reliable solution in real time. Parameters of this type can be called 'signaling parameters' (SPs). There are several SPs which carry information about the instantaneous localization of dynamic objects for each of the C-OTDR channels. The problem is that some of these parameters are very sensitive to the dynamics of seismoacoustic emission sources but are unstable, while, as a rule, an SP that is very stable becomes insensitive. This report describes a method for co-processing SPs which is designed to obtain the most effective localization estimates for dynamic objects within the C-OTDR monitoring system framework.

Keywords: C-OTDR-system, co-processing of signaling parameters, high-speed objects localization, multichannel monitoring systems

Procedia PDF Downloads 470
4688 Drivers of Farmers' Contract Compliance Behaviour: Evidence from a Case Study of Dangote Tomato Processing Plant in Northern Nigeria

Authors: Umar Shehu Umar

Abstract:

Contract farming is a viable strategy that agribusinesses rely on to strengthen vertical coordination. However, low contract compliance remains a significant setback to agribusinesses' contract performance. The present study aims to understand what drives smallholder farmers' contract compliance behaviour. Qualitative information was collected through focus group discussions to enrich the design of the survey questionnaire administered to a sample of 300 randomly selected farmers contracted by the Dangote Tomato Processing Plant (DTPP) in four regions of northern Nigeria. Novel transaction-level data on tomato sales covering one season were collected in addition to socio-economic information on the sampled farmers. Binary logistic model results revealed that open fresh-market tomato prices and payment delays negatively affect farmers' compliance behaviour, while quantity harvested, education level, and input provision correlated positively with compliance. The study suggests that contract compliance will increase if contracting firms devise a reliable and timely payment plan (e.g., digital payment), continue input and service provision (e.g., improved seeds, extension services), and offer incentives (e.g., loyalty rewards, bonuses) in the contract.
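For readers unfamiliar with the estimation step, the sketch below fits an illustrative binary logit of compliance on covariates named in the abstract, using synthetic data; the variable names, units, and coefficients are invented for illustration and do not reproduce the study's survey data.

```python
# Illustrative binary logit of compliance on the covariates named in the
# abstract (open-market price, payment delay, quantity harvested, education,
# input provision). The toy data and coefficient signs are placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "open_market_price": rng.normal(100, 15, n),    # illustrative price level
    "payment_delay_days": rng.integers(0, 30, n),
    "quantity_harvested": rng.normal(2.0, 0.5, n),  # tonnes, illustrative
    "education_years": rng.integers(0, 16, n),
    "input_provision": rng.integers(0, 2, n),
})
# Toy outcome generated so the signs match the reported findings
logit_true = (-0.02 * df.open_market_price - 0.08 * df.payment_delay_days
              + 0.9 * df.quantity_harvested + 0.05 * df.education_years
              + 0.7 * df.input_provision)
df["complied"] = (rng.random(n) < 1 / (1 + np.exp(-logit_true))).astype(int)

X = sm.add_constant(df.drop(columns="complied"))
model = sm.Logit(df["complied"], X).fit(disp=False)
print(model.summary())
```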

Keywords: contract farming, compliance, farmers and processors, smallholder

Procedia PDF Downloads 56
4687 Preparation and Characterization of Poly(L-Lactic Acid)/Oligo(D-Lactic Acid) Grafted Cellulose Composites

Authors: Md. Hafezur Rahaman, Mohd. Maniruzzaman, Md. Shadiqul Islam, Md. Masud Rana

Abstract:

With the growth of environmental awareness, extensive research is underway to develop the next generation of materials based on sustainability, eco-competence, and green chemistry, in order to preserve and protect the environment. Due to its biodegradability and biocompatibility, poly(L-lactic acid) (PLLA) has attracted great interest for ecological and medical applications. Cellulose is also one of the most abundant biodegradable, renewable polymers found in nature. It has several advantages, such as low cost, high mechanical strength, and biodegradability. Recently, a great deal of attention has been paid to the scientific and technological development of α-cellulose-based composite materials. PLLA could be used for grafting onto cellulose to improve compatibility prior to composite preparation. However, it is quite difficult to form a bond between less hydrophilic molecules like PLLA and α-cellulose. Dimers and oligomers can easily be grafted onto the surface of cellulose by ring-opening or polycondensation methods due to their low molecular weight. In this research, α-cellulose extracted from jute fiber is grafted with oligo(D-lactic acid) (ODLA) via a graft polycondensation reaction in the presence of para-toluenesulphonic acid and potassium persulphate in toluene at 130°C for 9 hours under 380 mmHg. Here, ODLA is synthesized by ring-opening polymerization of D-lactides in the presence of stannous octoate (0.03 wt% of lactide) and D-lactic acid at 140°C for 10 hours. Composites of PLLA with ODLA-grafted α-cellulose are prepared by solution mixing and film casting. Confirmation of grafting was carried out through FTIR spectroscopy and SEM analysis. A strong carbonyl peak in the FTIR spectrum of ODLA-grafted α-cellulose at 1728 cm⁻¹ confirms the grafting of ODLA onto α-cellulose; this peak is absent in α-cellulose. It is also observed from SEM photographs that there are some white areas (spots) on ODLA-grafted α-cellulose as compared to α-cellulose, which may indicate the grafting of ODLA and is consistent with the FTIR results. Analysis of the composites is carried out by FTIR, SEM, WAXD, and thermogravimetric analysis. Most of the characteristic FTIR absorption peaks of the composites shifted to higher wavenumbers with increasing peak area, which may confirm that PLLA and grafted cellulose have better compatibility in the composites via intermolecular hydrogen bonding; this supports previously published results. The distribution of grafted α-cellulose in the composites is uniform, as observed by SEM analysis. WAXD studies show that only homo-crystalline structures of PLLA are present in the composites. The thermal stability of the composites is enhanced with increasing percentages of ODLA-grafted α-cellulose; as a consequence, the resultant composites are more resistant to thermal degradation. The effects of the length of the grafted chain and the biodegradability of the composites will be studied in further research.

Keywords: α-cellulose, composite, graft polycondensation, oligo(D-lactic acid), poly(L-lactic acid)

Procedia PDF Downloads 117
4686 University Building: Discussion about the Effect of Numerical Modelling Assumptions for Occupant Behavior

Authors: Fabrizio Ascione, Martina Borrelli, Rosa Francesca De Masi, Silvia Ruggiero, Giuseppe Peter Vanoli

Abstract:

The refurbishment of public buildings is one of the key factors of the energy efficiency policies of European states. Educational buildings account for the largest share of the oldest building stock, with interesting potential for demonstrating best practice with regard to high-performance, low- and zero-carbon design and for becoming exemplar cases within the community. In this context, this paper discusses the critical issue of dealing with the energy refurbishment of a university building in the heating-dominated climate of southern Italy. More in detail, the importance of using validated models is examined exhaustively by proposing an analysis of the uncertainties due to modelling assumptions, mainly referring to the adoption of stochastic schedules for occupant behavior and equipment or lighting usage. Indeed, today, most commercial tools provide designers with a library of possible schedules with which thermal zones can be described. Very often, users do not pay close attention to differentiating thermal zones or to modifying and adapting predefined profiles, and the design results are affected, positively or negatively, without any warning. Data such as occupancy schedules, internal loads, and the interaction between people and windows or plant systems represent some of the largest sources of variability in energy modelling and in understanding calibration results. This is mainly due to the adoption of discrete, standardized, and conventional schedules, with important consequences for the prediction of energy consumption. The problem is certainly difficult to examine and to solve. In this paper, a sensitivity analysis is presented to understand the order of magnitude of the error committed by varying the deterministic schedules used for occupancy, internal loads, and the lighting system. This is a typical uncertainty for a case study such as the one presented here, where there is no regulation system for the HVAC system and thus the occupants cannot interact with it. More in detail, starting from the adopted schedules, created according to questionnaire responses, which allowed a good calibration of the energy simulation model, several different scenarios are tested. Two types of analysis are presented: the reference building is compared with these scenarios in terms of the percentage difference in the projected total electric energy need and natural gas demand. Then the different consumption entries are analyzed and, for the more interesting cases, the calibration indexes are also compared. Moreover, for the optimal refurbishment solution, the same simulations are performed. The variation in the predicted energy saving and global cost reduction is highlighted. This parametric study aims to underline the effect of the modelling assumptions made when describing thermal zones on the evaluation of performance indexes.

Keywords: energy simulation, modelling calibration, occupant behavior, university building

Procedia PDF Downloads 141
4685 Tracking and Classifying Client Interactions with Personal Coaches

Authors: Kartik Thakore, Anna-Roza Tamas, Adam Cole

Abstract:

The World Health Organization (WHO) reports that by 2030 more than 23.7 million deaths annually will be caused by cardiovascular diseases (CVDs), with a 2008 economic impact of $3.76 T. Metabolic syndrome is a disorder of multiple metabolic risk factors strongly implicated in the development of cardiovascular diseases. Guided lifestyle intervention driven by live coaching has been shown to have a positive impact on metabolic risk factors. Individuals' paths to improved (decreased) metabolic risk factors are driven by personal motivation and by personalized messages delivered by coaches and augmented by technology. Using interactions captured between 400 individuals and 3 coaches over a program period of 500 days, a preliminary model was designed. A novel real-time event tracking system was created to track and classify clients based on their genetic profile, baseline questionnaires, and usage of a mobile application with live coaching sessions. Classification of clients and coaches was done using a support vector machine application built on Apache Spark, the Stanford Natural Language Processing Library (SNLPL), and decision modeling.
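The core classification idea can be sketched as follows. The original system was built on Apache Spark with the Stanford NLP library, whereas this self-contained example substitutes scikit-learn's TF-IDF vectorizer and linear SVM; the messages and labels are invented for illustration.

```python
# Minimal sketch of the classification idea: coach-client messages become
# TF-IDF features and are classified with a linear SVM. scikit-learn is used
# here only to keep the example self-contained; the study's pipeline ran on
# Apache Spark with Stanford NLP. Messages and labels are illustrative.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

messages = [
    "Great job hitting your step goal this week, keep it up!",
    "I skipped my workout again, feeling unmotivated.",
    "Let's review your blood pressure log at our next session.",
    "I tried the meal plan and my energy is much better.",
]
labels = ["coach_encouragement", "client_setback",
          "coach_scheduling", "client_progress"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(messages, labels)
print(clf.predict(["I missed two sessions and feel discouraged."]))
```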

Keywords: guided lifestyle intervention, metabolic risk factors, personal coaching, support vector machines application, Apache Spark, natural language processing

Procedia PDF Downloads 433
4684 Evaluation of Different Cowpea Genotypes Using Grain Yield and Canning Quality Traits

Authors: Magdeline Pakeng Mohlala, R. L. Molatudi, M. A. Mofokeng

Abstract:

Cowpea (Vigna unguiculata (L.) Walp) is an important annual leguminous crop in semi-arid and tropical regions. Most cowpea grain production in South Africa is used for domestic consumption and as planting seed, and little or none is used in industrial processing; thus, there is a need to expand the utilization of cowpea through industrial processing. Agronomic traits contribute to the understanding of the association between yield and its component traits, facilitating effective selection for yield improvement. The aim of this study was to evaluate cowpea genotypes using grain yield and canning quality traits. The field experiment was conducted at two locations in Limpopo Province, namely the Syferkuil Agricultural Experimental Farm and Ga-Molepo village, during the 2017/2018 growing season, and canning took place at ARC-Grain Crops Potchefstroom. The experiment comprised 100 cowpea genotypes laid out in a Randomized Complete Block Design (RCBD). The grain yield, yield components, and canning quality traits were analysed using Genstat software. About 62 genotypes were suitable for canning; 38 were not, due to their seed coat texture and water uptake of less than 80%, which resulted in seeds that were too soft (mushy). Grain yield for RV115, 99k-494-6, ITOOK1263, RV111, RV353, and 53 other genotypes showed a high positive association with number of branches, pods per plant, number of seeds per pod, unshelled weight, and shelled weight at Syferkuil compared with Ga-Molepo; these genotypes are therefore recommended for canning quality.

Keywords: agronomic traits, canning quality, genotypes, yield

Procedia PDF Downloads 152
4683 Energy System Analysis Using Data-Driven Modelling and Bayesian Methods

Authors: Paul Rowley, Adam Thirkill, Nick Doylend, Philip Leicester, Becky Gough

Abstract:

The dynamic performance of all energy generation technologies is impacted to varying degrees by the stochastic properties of the wider system within which the generation technology is located. This stochasticity can include the varying nature of ambient renewable energy resources such as wind or solar radiation, or unpredicted changes in energy demand which impact the operational behaviour of thermal generation technologies. An understanding of these stochastic impacts is especially important in contexts such as highly distributed (or embedded) generation, where insight into issues affecting the individual or aggregated performance of large numbers of relatively small generators is needed, such as in ESCO projects. Probabilistic evaluation of monitored or simulated performance data is one technique which can provide insight into the dynamic performance characteristics of generating systems, both in a prognostic sense (such as the prediction of future performance at the project's design stage) and in a diagnostic sense (such as in the real-time analysis of underperforming systems). In this work, we describe the development, application, and outcomes of a new approach to the acquisition of datasets suitable for use in subsequent performance and impact analysis (including the use of Bayesian approaches) for a number of distributed generation technologies. The application of the approach is illustrated using a number of case studies involving domestic and small commercial scale photovoltaic, solar thermal, and natural gas boiler installations, and the results presented show that the methodology offers significant advantages in terms of plant efficiency prediction or diagnosis, along with allied environmental and social impacts such as greenhouse gas emission reduction and fuel affordability.

Keywords: renewable energy, dynamic performance simulation, Bayesian analysis, distributed generation

Procedia PDF Downloads 495
4682 pH and Temperature Triggered Release of Doxorubicin from Hydrogen Bonded Multilayer Films of Polyoxazolines

Authors: Meltem Haktaniyan, Eda Cagli, Irem Erel Goktepe

Abstract:

Polymers that change their properties in response to different stimuli (e.g., light, temperature, pH, ionic strength, or magnetic field) are called 'smart' or 'stimuli-responsive' polymers. These polymers have been widely used in biomedical applications such as sensors, gene delivery, drug delivery, and tissue engineering. Temperature-responsive polymers have been studied extensively for controlled drug delivery applications. Among pseudo-peptides, poly(2-alkyl-2-oxazoline)s are considered good candidates for delivery systems due to their stealth behavior and non-toxicity. For building responsive multilayer films for controlled drug release from surfaces, the layer-by-layer (LbL) technique is a powerful method, with the advantage of nanometer-scale control over spatial architecture and morphology. Multilayers can be constructed on surfaces through non-covalent interactions, including electrostatic interactions, hydrogen bonding, charge-transfer, or hydrophobic-hydrophobic interactions. In the present study, hydrogen-bonded multilayer films of poly(2-alkyl-2-oxazoline)s with tannic acid were prepared in order to serve as a platform for releasing doxorubicin (DOX) from the surface with pH and thermal triggers. For this purpose, poly(2-isopropyl-2-oxazoline) (PIPOX) and poly(2-ethyl-2-oxazoline) (PETOX) were synthesized via cationic ring-opening polymerization (CROP) with hydroxyl end groups. Two polymeric multilayer systems ((PETOX)/(DOX)-(TA) complexes and (PIPOX)/(DOX)-(TA) complexes) were designed to investigate the controlled release of doxorubicin (DOX) from the surface with pH and thermal triggers. The drug release profiles from the multilayer thin films under alterations of pH and temperature will be examined with UV-Vis spectroscopy and fluorescence spectroscopy.

Keywords: temperature responsive polymers, h-bonded multilayer films, drug release, polyoxazoline

Procedia PDF Downloads 308
4681 Restoration of Digital Design Using Row and Column Major Parsing Technique from the Old/Used Jacquard Punched Cards

Authors: R. Kumaravelu, S. Poornima, Sunil Kumar Kashyap

Abstract:

The optimized and digitalized restoration of the information from old and used manual jacquard punched cards in the textile industry is referred to as a Jacquard Punch Card (JPC) reader. In this paper, we present the novel design and development of a photoelectronics-based system for reading old and used punched cards and storing their binary information for transformation into an effective image file format. In the textile industry, the jacquard punched card holes have diameters of 3 mm and 5 mm at a 5.5 mm pitch. Before the adoption of computing systems in the textile industry, these punched cards were prepared manually without a digital design source, yet they carry rich woven designs. The idea is to retrieve the binary information from the jacquard punched cards and store it in a digital (non-graphics) format before processing it. After processing, the digital (non-graphics) format is converted into an effective image file format through either a row-major or a column-major parsing technique. To accomplish these activities, an embedded-system-based device and integrated software were developed. As part of the test and trial activity, the device was tested and installed for industrial service at the Weavers' Service Centre, Kanchipuram, Tamil Nadu, India.
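A minimal sketch of the row-major versus column-major parsing step is shown below, assuming the reader hardware has already produced a binary hole matrix; the card dimensions and PNG output are assumptions for illustration.

```python
# Sketch of the parsing step only: a binary hole matrix (as read by the
# photo-LED/photo-diode hardware) is serialized in row-major or column-major
# order and written out as an image. Card size and the PNG output format are
# assumptions for illustration.
import numpy as np
from PIL import Image

def parse_bits(bit_matrix, order="row"):
    """Serialize a 2-D 0/1 matrix in row-major ('row') or column-major ('col') order."""
    m = np.asarray(bit_matrix, dtype=np.uint8)
    return m.flatten(order="C" if order == "row" else "F")

def bits_to_image(bits, rows, cols, path):
    pixels = (bits.reshape(rows, cols) * 255).astype(np.uint8)   # punched hole -> white pixel
    Image.fromarray(pixels, mode="L").save(path)

card = np.random.default_rng(3).integers(0, 2, size=(8, 16))     # toy 8x16 punched card
row_stream = parse_bits(card, "row")      # R1C1, R1C2, R1C3, ...
col_stream = parse_bits(card, "col")      # R1C1, R2C1, R3C1, ...
bits_to_image(row_stream, 8, 16, "restored_design.png")

print("first 8 bits, row-major   :", row_stream[:8])
print("first 8 bits, column-major:", col_stream[:8])
```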

Keywords: file system, SPI, UART, ARM controller, jacquard, punched card, photo LED, photo diode

Procedia PDF Downloads 167
4680 Identification of Lipo-Alkaloids and Fatty Acids in Aconitum carmichaelii Using Liquid Chromatography–Mass Spectrometry and Gas Chromatography–Mass Spectrometry

Authors: Ying Liang, Na Li

Abstract:

Lipo-alkaloids are a class of C19-norditerpenoid alkaloids found in Aconitum species, usually containing an aconitane skeleton and one or two fatty acid residues. Their structures are very similar to those of the diester-type alkaloids, which are considered the main bioactive components in Aconitum carmichaelii. They have anti-inflammatory, anti-nociceptive, and anti-proliferative activities. So far, more than 200 lipo-alkaloids have been reported from plants, semisynthesis, and biotransformations. In our research, by the combination of ultra-high performance liquid chromatography-quadrupole time-of-flight mass spectrometry (UHPLC-Q-TOF-MS) and an in-house database, 148 lipo-alkaloids were identified from A. carmichaelii, including 93 potential new compounds and 38 compounds with oxygenated fatty acid moieties. To our knowledge, this is the first report of oxygenated fatty acids as side chains in naturally occurring lipo-alkaloids. Considering that the fatty acid residues in lipo-alkaloids should come from the free acids in the plant, the fatty acids and their relationship with the lipo-alkaloids were further investigated by GC-MS and LC-MS. Among the 17 fatty acids identified by GC-MS, 12 were detected as side chains of lipo-alkaloids, accounting for about one third of the total lipo-alkaloids, while these fatty acid residues made up less than one quarter of the total fatty acid residues. In total, 37 fatty acids were determined by UHPLC-Q-TOF-MS, including 18 oxidized fatty acids identified for the first time from A. carmichaelii. These fatty acids were observed as side chains of lipo-alkaloids. In addition, although over 140 lipo-alkaloids were identified, six lipo-alkaloids, 8-O-linoleoyl-14-benzoylmesaconine (1), 8-O-linoleoyl-14-benzoylaconine (2), 8-O-palmitoyl-14-benzoylmesaconine (3), 8-O-oleoyl-14-benzoylmesaconine (4), 8-O-pal-benzoylaconine (5), and 8-O-ole-benzoylaconine (6), were found to be the main components, accounting for over 90% of the total lipo-alkaloid content. Therefore, using these six components as standards, a UHPLC-triple quadrupole-MS (UHPLC-QQQ-MS) approach was established to investigate the influence of processing on the contents of lipo-alkaloids. Although it is commonly supposed that the contents of lipo-alkaloids increase after processing, our research showed no significant change before and after processing. Using the same methods, the lipo-alkaloids in the lateral roots of A. carmichaelii and the roots of A. kusnezoffii were determined and quantified. The contents of lipo-alkaloids in A. kusnezoffii were close to those of the parent roots of A. carmichaelii, while the lateral roots had fewer lipo-alkaloids than the parent roots. This work was supported by the Macao Science and Technology Development Fund (086/2013/A3 and 003/2016/A1).

Keywords: Aconitum carmichaelii, fatty acids, GC-MS, LC-MS, lipo-alkaloids

Procedia PDF Downloads 299
4679 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks

Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone

Abstract:

Seizures are the main factor that affects the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made using continuous electroencephalogram (EEG) signal monitoring. Seizure identification in EEG signals is performed manually by epileptologists, and this process is usually very long and error-prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm and by using a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool is able to cover all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on data of a single patient retrieved from a publicly available EEG dataset.
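An illustrative reduction of the pipeline above is sketched below: a sliding window passes over an EEG trace, a few simple statistical features are extracted per window, and a multilayer perceptron is trained to label windows. The real Training Builder tool extracts a far richer feature set; the synthetic signals, window length, and labels here are assumptions.

```python
# Illustrative sketch of the core steps only: sliding-window feature
# extraction over an EEG trace followed by a multilayer perceptron trained to
# label windows as seizure / non-seizure. The actual feature set (spectral,
# information-theoretic, etc.) is much larger; toy signals are used here.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def window_features(signal, fs=256, win_s=2.0):
    win = int(fs * win_s)
    feats = []
    for start in range(0, len(signal) - win + 1, win):           # non-overlapping windows
        seg = signal[start:start + win]
        feats.append([seg.mean(), seg.std(), seg.min(), seg.max(),
                      np.mean(np.abs(np.diff(seg)))])            # crude "line length"
    return np.array(feats)

rng = np.random.default_rng(42)
background = rng.standard_normal(256 * 120)                      # 2 min of "normal" EEG
seizure = 3 * np.sin(2 * np.pi * 3 * np.arange(256 * 30) / 256) + rng.standard_normal(256 * 30)

bg, sz = window_features(background), window_features(seizure)
X = np.vstack([bg, sz])
y = np.array([0] * len(bg) + [1] * len(sz))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0).fit(X_tr, y_tr)
print("window-level accuracy:", round(mlp.score(X_te, y_te), 3))
```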

Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing

Procedia PDF Downloads 188
4678 Application to Monitor the Citizens for Corona and Get Medical Aids or Assistance from Hospitals

Authors: Vathsala Kaluarachchi, Oshani Wimalarathna, Charith Vandebona, Gayani Chandrarathna, Lakmal Rupasinghe, Windhya Rankothge

Abstract:

It is the fundamental function of a monitoring system to allow users to collect and process data. A worldwide threat, the corona outbreak has wreaked havoc in Sri Lanka, and the situation has gotten out of hand. Since the epidemic began, the Sri Lankan government has been unable to establish a systematic system for monitoring corona patients and providing emergency care in the event of an outbreak. Most patients have been kept at home because of the high number of patients reported in the nation, but they do not yet have access to a functioning medical system. This has resulted in an increase in the number of patients who have been left untreated because of a lack of medical care. The absence of competent medical monitoring is the biggest cause of mortality for many people nowadays, according to our survey. As a result, a smartphone app for analyzing the patient's state and determining whether they should be hospitalized will be developed. Using the data supplied, we aim to send an alert letter or SMS to the hospital once the system recognizes that a patient needs care. Since we know what those patients need and when they need it, we will set up a desktop program at the hospital to monitor their progress. Deep learning, image processing and application development, natural language processing, and blockchain management are some of the components of the research solution. The purpose of this research paper is to introduce a mechanism to connect hospitals and patients even when they are physically apart. Data security and user-friendliness are further enhanced through blockchain and NLP.

Keywords: blockchain, deep learning, NLP, monitoring system

Procedia PDF Downloads 133
4677 Commonly Used Non-Medical Practices and Perceived Benefits in Couples with Fertility Problems in Turkey

Authors: S. Fata, M. A. Tokat, N. Bagardi, B. Yilmaz

Abstract:

Nowadays, various traditional practices are used throughout the world with the aim of improving fertility. Traditional remedies, acupuncture, and religious practices such as sacrifice are frequently used. Studies often evaluate the traditional practices used by women, but the use of these non-medical practices by couples and the specific reasons for applying these methods have been less investigated. The aim of this study was to evaluate the commonly used non-medical practices and determine the perceived benefits among couples with fertility problems in Turkey. This is a descriptive study. Research data were collected between May and July 2016, in the Izmir Ege Birth Education and Research Hospital Assisted Reproduction Clinic, from 151 couples with fertility problems. A Personal Information Form and a Non-Medical Practices Used for Fertility Evaluation Form were used. Permission letter number 'GOA 2649' from the Dokuz Eylul University Non-Invasive Research Ethics Board, a permission letter from the institution, and written consent from the participants were obtained to carry out the study. In the evaluation of the data, frequency and proportion analyses were used. The average age of the women participating in the study was 32.87 years; 35.8% were high school graduates, 60.3% were housewives, and 58.9% lived in a city. Of the husbands, 30.5% were high school graduates, 96.7% were employed, and 60.9% lived in a city. Of the couples, 78.1% lived as a nuclear family, the average marriage duration was 7.58 years, in 33.8% the fertility problem stemmed from the woman, 42.4% had had a diagnosis for 1-2 years, and 35.1% had been in treatment for 1-2 years. Of the women, 35.8% reported the use of non-medical practices: 24.4% used figs, onion cure, hacemat, locust, or bee-pollen milk; 18.2% used herbs; 13.1% made vows; 12.1% visited a tomb; 10.1% did not bathe for a few days after the embryo transfer; 9.1% used thermal water baths; 5.0% had the womb manually corrected; 5.0% obtained amulets from a Hodja; and 3.0% went to a Hodja/pilgrims. Among the perceived benefits of using non-medical practices, facilitating pregnancy and implantation and improving oocyte quality were the most frequently expressed. Women said that they often used herbs to develop follicles, did not bathe after embryo transfer with the aim of supporting implantation, and used thermal waters to get rid of infection. Compared to women, only 25.8% of men used non-medical practices: 52.1% reported that they used peanuts, hacemat, locust, or bee-pollen milk; 14.9% used herbs; 12.8% made vows; 10.1% visited a tomb; and 10.1% used thermal water baths. Improving sperm count, motility, and quality were the most expected benefits. Men said that they often used herbs to improve sperm count and used peanuts, hacemat, locust, or bee-pollen milk to improve sperm motility and quality. Couples in Turkey often use non-medical practices to deal with fertility problems. Some of the practices considered useful can adversely affect health. Healthcare providers should evaluate the use of non-medical practices and should inform couples if a practice has known adverse effects on health.

Keywords: fertility, couples, non-medical practice, perceived benefit

Procedia PDF Downloads 342
4676 Revealing the Urban Heat Island: Investigating its Spatial and Temporal Changes and Relationship with Air Quality

Authors: Aneesh Mathew, Arunab K. S., Atul Kumar Sharma

Abstract:

The uncontrolled rise in population has led to unplanned, swift, and unsustainable urban expansion, causing detrimental environmental impacts on both local and global ecosystems. This research delves into a comprehensive examination of the Urban Heat Island (UHI) phenomenon in Bengaluru and Hyderabad, India. It centers on the spatial and temporal distribution of UHI and its correlation with air pollutants. Conducted across summer and winter seasons from 2001 to 2021 in Bangalore and Hyderabad, this study discovered that UHI intensity varies seasonally, peaking in summer and decreasing in winter. The annual maximum UHI intensities range between 4.65 °C to 6.69 °C in Bengaluru and 5.74 °C to 6.82 °C in Hyderabad. Bengaluru particularly experiences notable fluctuations in average UHI intensity. Introducing the Urban Thermal Field Variance Index (UTFVI), the study indicates a consistent strong UHI effect in both cities, significantly impacting living conditions. Moreover, hotspot analysis demonstrates a rising trend in UHI-affected areas over the years in Bengaluru and Hyderabad. This research underscores the connection between air pollutant concentrations and land surface temperature (LST), highlighting the necessity of comprehending UHI dynamics for urban environmental management and public health. It contributes to a deeper understanding of UHI patterns in swiftly urbanizing areas, providing insights into the intricate relationship between urbanization, climate, and air quality. These findings serve as crucial guidance for policymakers, urban planners, and researchers, facilitating the development of innovative, sustainable strategies to mitigate the adverse impacts of uncontrolled expansion while promoting the well-being of local communities and the global environment.
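For reference, the abstract does not state the formula behind the Urban Thermal Field Variance Index; the commonly used definition, given here as background rather than as the authors' exact formulation, relates each pixel's land surface temperature T_s to the mean LST of the study area T_mean:

```latex
% Commonly used UTFVI definition (assumed, not quoted from the abstract):
% T_s is the pixel land surface temperature, T_mean the mean LST of the area.
\mathrm{UTFVI} = \frac{T_s - T_{\mathrm{mean}}}{T_s}
```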

Keywords: urban heat island effect, land surface temperature, air pollution, urban thermal field variance index

Procedia PDF Downloads 80
4675 Learner's Difficulties Acquiring English: The Case of Native Speakers of Rio de La Plata Spanish Towards Justifying the Need for Corpora

Authors: Maria Zinnia Bardas Hoffmann

Abstract:

Contrastive Analysis (CA) is the systematic comparison between two languages. It stems from the notion that errors are caused by interference of the L1 system in the acquisition process of an L2. CA represents a useful tool for understanding the nature of learning and acquisition. This particular method also promises a path to understanding the nature of underlying cognitive processes, even when other factors such as intrinsic motivation and teaching strategies were found to best explain students' problems in acquisition. The study of CA is justified not only by the need to gain a deeper understanding of the nature of SLA, but also as an invaluable source of clues, at a cognitive level, about the general processes involved in rule formation and abstract thought. It is relevant for cross-disciplinary studies and for the fields of computational thought, natural language processing, applied linguistics, cognitive linguistics, and math theory. That being said, this paper also intends to address CA's own set of constraints and limitations. Finally, this paper (a) aims at identifying some of the difficulties students may find in their learning process due to the nature of their specific variety of L1, Rio de la Plata Spanish (RPS), and (b) represents an attempt to discuss the necessity for specific models to approach CA.

Keywords: second language acquisition, applied linguistics, contrastive analysis, applied contrastive analysis English language department, meta-linguistic rules, cross-linguistics studies, computational thought, natural language processing

Procedia PDF Downloads 150
4674 Effect of Baffles on the Cooling of Electronic Components

Authors: O. Bendermel, C. Seladji, M. Khaouani

Abstract:

In this work, we carried out a numerical study of the thermal and dynamic behaviour of air in a horizontal channel containing electronic components. The influence of using baffles on the velocity and temperature profiles is discussed. The finite volume method and the SIMPLE algorithm are used for solving the equations of conservation of mass, momentum, and energy. The results show that baffles improve heat transfer between the cooling air and the electronic components. The velocity increases by about a factor of 3 relative to the initial velocity.

Keywords: electronic components, baffles, cooling, fluids engineering

Procedia PDF Downloads 297
4673 Theorizing Optimal Use of Numbers and Anecdotes: The Science of Storytelling in Newsrooms

Authors: Hai L. Tran

Abstract:

When covering events and issues, the news media often employ both personal accounts and facts and figures. However, the process of using numbers and narratives in the newsroom is mostly a matter of trial and error. There is a demonstrated need for the news industry to better understand the specific effects of storytelling and data-driven reporting on the audience, as well as the explanatory factors driving such effects. In the academic world, anecdotal evidence and statistical evidence have been studied in a mutually exclusive manner. Existing research tends to treat the pertinent effects as though the use of one form precludes the other and as if a tradeoff is required. Meanwhile, narratives and statistical facts are often combined in various communication contexts, especially in news presentations. There is value in reconceptualizing and theorizing about both the relative and collective impacts of numbers and narratives, as well as the mechanism underlying such effects. The current undertaking seeks to link theory to practice by providing a complete picture of how and why people are influenced by information conveyed through quantitative and qualitative accounts. Specifically, cognitive-experiential theory is invoked to argue that humans employ two distinct systems to process information. The rational system processes logical evidence through effortful analytical cognition, which is affect-free. Meanwhile, the experiential system is intuitive, rapid, automatic, and holistic, thereby demanding minimal cognitive resources and relating to the experience of affect. In certain situations, one system might dominate the other, but the rational and experiential modes of processing operate in parallel and at the same time. As such, anecdotes and quantified facts impact audience response differently, and a combination of data and narratives is more effective than either form of evidence alone. In addition, the present study identifies several media variables and human factors driving the effects of statistics and anecdotes. An integrative model is proposed to explain how message characteristics (modality, vividness, salience, congruency, position) and individual differences (involvement, numeracy skills, cognitive resources, cultural orientation) impact selective exposure, which in turn activates pertinent modes of processing and thereby induces corresponding responses. The present study represents a step toward bridging theoretical frameworks from various disciplines to better understand the specific effects and the conditions under which the use of anecdotal evidence and/or statistical evidence enhances or undermines information processing. In addition to its theoretical contributions, this research helps inform news professionals about the benefits and pitfalls of incorporating quantitative and qualitative accounts in reporting. It proposes a typology of possible scenarios and appropriate strategies for journalists to use when presenting news with anecdotes and numbers.

Keywords: data, narrative, number, anecdote, storytelling, news

Procedia PDF Downloads 79
4672 A Palmprint Identification System Based on Multi-Layer Perceptron

Authors: David P. Tantua, Abdulkader Helwan

Abstract:

Biometrics has recently been used for human identification systems based on biological traits such as fingerprints and iris scans. Biometrics-based identification systems show great efficiency and accuracy in such human identification applications. However, these types of systems are so far based on some image processing techniques only, which may decrease the efficiency of such applications. Thus, this paper aims to develop a human palmprint identification system using a multi-layer perceptron neural network, which has the capability to learn using a backpropagation learning algorithm. The developed system uses images obtained from a public database available on the internet (CASIA). The processing pipeline is as follows: image filtering using a median filter, image adjustment, image skeletonizing, edge detection using the Canny operator to extract features, and removal of unwanted components of the image. The second phase is to feed those processed images into a neural network classifier, which will adaptively learn and create a class for each different image. 100 different images are used for training the system. Since this is an identification system, it should be tested with the same images. Therefore, the same 100 images are used for testing it, and any image outside the training set should be unrecognized. The experimental results show that the developed system achieves 100% accuracy and can be implemented in real-life applications.
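The processing chain named above can be sketched, under stated assumptions, as follows; synthetic arrays stand in for the CASIA palmprint images, and the image size, thresholds, and network width are placeholders.

```python
# Hypothetical sketch of the described chain: median filtering, histogram
# equalization (image adjustment), skeletonizing, Canny edge detection, and a
# multi-layer perceptron trained with backpropagation on the flattened edge
# maps. Synthetic arrays stand in for the CASIA palmprints; all sizes and
# parameters are placeholders.
import cv2
import numpy as np
from skimage.morphology import skeletonize
from sklearn.neural_network import MLPClassifier

def palm_features(img, size=(64, 64)):
    img = cv2.medianBlur(img, 5)                           # median filter
    img = cv2.equalizeHist(img)                            # image adjustment
    skel = skeletonize(img > 127).astype(np.uint8) * 255   # skeletonizing
    edges = cv2.Canny(skel, 50, 150)                       # Canny edge detection
    return cv2.resize(edges, size).flatten() / 255.0       # flattened feature vector

rng = np.random.default_rng(7)
images = [rng.integers(0, 256, (128, 128), dtype=np.uint8) for _ in range(20)]  # placeholder palms
X = np.array([palm_features(im) for im in images])
y = np.arange(len(images))                                 # identification: one class per image

mlp = MLPClassifier(hidden_layer_sizes=(128,), max_iter=2000, random_state=0).fit(X, y)
print("recognized identity for image 0:", mlp.predict(X[:1])[0])
```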

Keywords: biometrics, biological traits, multi-layer perceptron neural network, image skeletonizing, edge detection using canny operator

Procedia PDF Downloads 371
4671 Study and Fine Characterization of the SS 316L Microstructures Obtained by Laser Beam Melting Process

Authors: Sebastien Relave, Christophe Desrayaud, Aurelien Vilani, Alexey Sova

Abstract:

Laser beam melting (LBM) is an additive manufacturing process that enables complex 3D parts to be designed. This process is now commonly employed for various applications, such as chemistry or energy, requiring the use of stainless steel grades. LBM can offer comparable and sometimes superior mechanical properties to those of wrought materials. However, we observed an anisotropic microstructure resulting from the process, caused by the very high thermal gradients along the building axis. This microstructure can be harmful depending on the application. For this reason, control and prediction of the microstructure are important to ensure the improvement and reproducibility of the mechanical properties. This study is focused on the 316L SS grade and aims at understanding the solidification and transformation mechanisms during the process. Experiments were carried out to analyse the nucleation and growth of the microstructure obtained by the LBM process under several conditions. The samples were built on different types of support: bulk and lattice. Samples were produced on a ProX DMP 200 LBM device. For the two conditions, the analysis of the microstructures by SEM and EBSD revealed a single austenite phase with preferential crystallite growth along the (100) plane. The microstructure presented a hierarchical structure consisting of columnar grains with sizes in the range of 20-100 µm and a sub-grain structure of size 0.5 μm. These sub-grains were found in different shapes (columnar and cellular). This difference can be explained by a variation of the thermal gradient and cooling rate or by element segregation, although no sign of element segregation was found at the sub-grain boundaries. A high dislocation concentration was observed at the sub-grain boundaries. These sub-grains are separated by very low misorientation walls (< 2°), which causes lattice curvature inside the large grains. A discussion is proposed on the formation of these microstructures with regard to the LBM process conditions.

Keywords: selective laser melting, stainless steel, microstructure

Procedia PDF Downloads 157
4670 Simulation, Design, and 3D Print of Novel Highly Integrated TEG Device with Improved Thermal Energy Harvest Efficiency

Authors: Jaden Lu, Olivia Lu

Abstract:

Despite the remarkable advancement of solar cell technology, the challenge of optimizing total solar energy harvest efficiency persists, primarily due to significant heat loss. This excess heat not only diminishes solar panel output efficiency but also curtails its operational lifespan. A promising approach to address this issue is the conversion of surplus heat into electricity. In recent years, there has been growing interest in the use of thermoelectric generators (TEG) as a potential solution. The integration of efficient TEG devices holds the promise of augmenting overall energy harvest efficiency while prolonging the longevity of solar panels. While certain research groups have proposed the integration of solar cells and TEG devices, a substantial gap between conceptualization and practical implementation remains, largely attributed to the low thermal energy conversion efficiency of TEG devices. To bridge this gap and meet the requisites of practical application, a feasible strategy involves the incorporation of a substantial number of p-n junctions within a confined unit volume. However, the manufacturing of high-density TEG p-n junctions presents a formidable challenge. The prevalent solution often leads to large device sizes to accommodate enough p-n junctions, consequently complicating integration with solar cells. Recently, the adoption of 3D printing technology has emerged as a promising way to address this challenge by fabricating high-density p-n arrays, although further development is still needed. Presently, the primary focus is on 3D printing of vertically layered TEG devices, in which the p-n junction density remains constrained by spatial limitations and by the 3D printing techniques themselves. This study proposes a novel device configuration featuring horizontally arrayed p-n junctions of Bi2Te3. The structural design of the device is simulated with the finite element method (FEM) in COMSOL Multiphysics, and various device configurations are compared to identify the optimal device structure. Based on the simulation results, a new TEG device is fabricated using selective laser melting (SLM) 3D printing technology, with Fusion 360 used to translate the COMSOL device structure into a 3D print file. The horizontal design offers a unique advantage: it enables the fabrication of densely packed, three-dimensional p-n junction arrays. The fabrication process entails printing a single row of horizontal p-n junctions in one layer using SLM, then printing successive rows of p-n junction arrays within the same layer, interconnected by thermally conductive copper. This sequence is replicated across multiple layers, separated by thermally insulating glass, resulting in a highly compact three-dimensional TEG device with high-density p-n junctions. The fabricated TEG device is then attached to the bottom of the solar cell with thermal glue. The whole device is characterized, and the output data closely match the COMSOL simulation results. Future research will encompass the refinement of thermoelectric materials, including the advancement of high-resolution 3D printing techniques tailored to diverse thermoelectric materials, along with the optimization of material microstructures such as porosity and doping. The objective is to achieve an optimal, highly integrated PV-TEG device that can substantially increase solar energy harvest efficiency.
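
As a back-of-the-envelope companion to the abstract above, the sketch below evaluates the standard thermoelectric figure of merit ZT = S²σT/κ and the ideal TEG conversion efficiency η = (ΔT/T_h)·(√(1+ZT) − 1)/(√(1+ZT) + T_c/T_h); the Bi2Te3 material values and operating temperatures are typical literature figures assumed for illustration, not measurements from this work.

```python
import math

def figure_of_merit(seebeck, sigma, kappa, T):
    """ZT = S^2 * sigma * T / kappa (dimensionless)."""
    return seebeck**2 * sigma * T / kappa

def teg_efficiency(T_hot, T_cold, ZT):
    """Ideal TEG efficiency: Carnot factor times the ZT-dependent reduction."""
    carnot = (T_hot - T_cold) / T_hot
    m = math.sqrt(1.0 + ZT)
    return carnot * (m - 1.0) / (m + T_cold / T_hot)

# Representative (assumed) Bi2Te3 properties near room temperature.
S = 200e-6        # Seebeck coefficient, V/K
sigma = 1.0e5     # electrical conductivity, S/m
kappa = 1.5       # thermal conductivity, W/(m*K)
T_hot, T_cold = 350.0, 300.0   # K, e.g. solar-panel back side vs. ambient
T_mean = 0.5 * (T_hot + T_cold)

ZT = figure_of_merit(S, sigma, kappa, T_mean)
print(f"ZT ~ {ZT:.2f}, ideal efficiency ~ {100 * teg_efficiency(T_hot, T_cold, ZT):.1f}%")
```

With these assumed numbers ZT is roughly 0.9 and the ideal efficiency only a few percent, which is why packing many p-n junctions into a small volume matters for harvesting useful power from the modest temperature difference under a solar panel.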

Keywords: thermoelectric, finite element method, 3D print, energy conversion

Procedia PDF Downloads 62
4669 Sustainable Thermal Energy Storage Technologies: Enhancing Post-Harvest Drying Efficiency in Sub-Saharan Agriculture

Authors: Luís Miguel Estevão Cristóvão, Constâncio Augusto Machanguana, Fernando Chichango, Salvador Grande

Abstract:

Sub-Saharan African nations depend heavily on agriculture, a sector largely marked by low productivity. Most farmers live in rural areas and employ basic, labor-intensive technologies that lead to time inefficiencies and low overall effectiveness. Even with attempts to improve farmers' welfare through improved seeds and fertilizers, meaningful outcomes are yet to be achieved because of substantial post-harvest losses. Such losses endanger food security and economic stability and result in unsustainable agricultural practices, because more land, water, labor, energy, fertilizer, and other inputs must be used to produce more food. Drying, as a critical post-harvest process involving simultaneous heat and mass transfer, therefore deserves attention. Among alternative green-energy sources, solar-energy-based drying is particularly attractive for small-scale farmers in remote communities. However, the intermittent nature of solar radiation poses challenges. To address this, energy storage options such as rock-based thermal energy storage offer a cost-effective solution tailored to farmers' needs. Methodologically, three solar dryers were constructed of metal, wood, and clay brick, and several tests were carried out with and without energy storage material. Notably, the results demonstrate that soapstone stands out as a promising storage material due to its affordability and high specific heat capacity. By implementing these greener technologies, Sub-Saharan African countries could mitigate post-harvest losses, enhance food availability, improve nutrition, and promote sustainable resource utilization.
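
To make the role of the storage material concrete, the short sketch below estimates the sensible heat a rock bed can hold using Q = m·c_p·ΔT; the soapstone mass, specific heat, and temperature swing are illustrative assumptions rather than values reported by the authors.

```python
def sensible_heat_stored(mass_kg, cp_j_per_kg_k, delta_t_k):
    """Sensible heat stored in a rock bed: Q = m * c_p * dT (joules)."""
    return mass_kg * cp_j_per_kg_k * delta_t_k

# Assumed example: a 150 kg soapstone bed charged from 30 C to 90 C during the day.
mass = 150.0      # kg of soapstone packed under the drying chamber
cp = 980.0        # J/(kg*K), a typical literature value for soapstone
delta_t = 60.0    # K temperature rise during daytime charging

q_joules = sensible_heat_stored(mass, cp, delta_t)
print(f"Stored heat: {q_joules / 3.6e6:.1f} kWh")   # 1 kWh = 3.6 MJ
```

Under these assumptions the bed holds about 2.5 kWh of heat, which it releases overnight or during cloudy spells and so smooths out the intermittency of the solar input.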

Keywords: energy storage, food security, post-harvest, solar dryer

Procedia PDF Downloads 24
4668 Analyzing the Performance of Different Cost-Based Methods for the Corrective Maintenance of a System in Thermal Power Plants

Authors: Demet Ozgur-Unluakin, Busenur Turkali, S. Caglar Aksezer

Abstract:

Since the age of industrialization, maintenance has always been a crucial element for all kinds of factories and plants. With today's increasingly advanced technology, the structure of such facilities has become more complicated, and even a small operational disruption may cause huge profit losses for the companies. In order to reduce these costs, effective maintenance planning is crucial, but at the same time it is a difficult task because of the complexity of the systems. The most important aspect of correct maintenance planning is to understand the structure of the system, not to ignore the dependencies among the components, and, as a result, to model the system correctly. In this way, it becomes easier to understand which component improves the system most when it is maintained. Undoubtedly, proactive maintenance at a scheduled time reduces costs, because scheduled maintenance prevents large profit losses. However, the necessity of corrective maintenance, which directly addresses the state of the system and provides immediate intervention when the system fails, should not be ignored. When a fault occurs in the system, waiting for the next proactive maintenance time instead of solving the problem immediately may increase costs. This study proposes various maintenance methods with different efficiency measures under a corrective maintenance strategy on a subsystem of a thermal power plant. To model the dependencies between the components, a dynamic Bayesian network approach is employed. The proposed maintenance methods aim to minimize the total maintenance cost over a planning horizon and to find the most appropriate component to act on, i.e., the one whose maintenance improves system reliability the most. The performances of the methods are compared under the corrective maintenance strategy, and a sensitivity analysis is also carried out under different cost values. Results show that all fault-effect methods perform better than the replacement-effect methods, and this conclusion also holds under different downtime cost values.
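
The toy sketch below illustrates the flavor of the decision problem: component beliefs are propagated one time slice forward and candidate corrective repairs are ranked by expected next-step cost. It is a deliberately simplified stand-in that treats components as independent and so ignores the inter-component dependencies a full dynamic Bayesian network would capture; the component names, probabilities, series structure, and costs are invented for illustration and are not the authors' plant model.

```python
# Toy two-component subsystem in series; a component either works or has failed.
survive = {"pump": 0.95, "valve": 0.90}     # assumed per-period survival probabilities
belief = {"pump": 0.70, "valve": 0.60}      # current P(component is working)
repair_cost = {"pump": 5.0, "valve": 3.0}   # assumed corrective repair costs
downtime_cost = 20.0                        # cost incurred if the system is down

def system_reliability(bel):
    """Series system: works only if every component works (independence assumed)."""
    r = 1.0
    for p in bel.values():
        r *= p
    return r

def one_step(bel, repaired=None):
    """Propagate beliefs one slice forward; a repaired component is restored to 1."""
    return {c: (1.0 if c == repaired else p) * survive[c] for c, p in bel.items()}

# Corrective decision: compare expected next-step cost for each repair choice.
for choice in [None, "pump", "valve"]:
    rel = system_reliability(one_step(belief, repaired=choice))
    cost = repair_cost.get(choice, 0.0) + (1.0 - rel) * downtime_cost
    print(f"repair={choice}: next-step reliability={rel:.3f}, expected cost={cost:.2f}")
```

Ranking repair actions by this kind of expected-cost criterion, slice by slice over the planning horizon, is the spirit of the fault-effect versus replacement-effect comparison described in the abstract.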

Keywords: dynamic Bayesian networks, maintenance, multi-component systems, reliability

Procedia PDF Downloads 128