Search results for: cassava starch processing wastewater
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4777

2917 Effects of Soaking of Maize on the Viscosity of Masa and Tortilla Physical Properties at Different Nixtamalization Times

Authors: Jorge Martínez-Rodríguez, Esther Pérez-Carrillo, Diana Laura Anchondo Álvarez, Julia Lucía Leal Villarreal, Mariana Juárez Dominguez, Luisa Fernanda Torres Hernández, Daniela Salinas Morales, Erick Heredia-Olea

Abstract:

Maize tortillas are a staple food in Mexico, most of which are made by nixtamalization, a process that includes the cooking and steeping of maize kernels in alkaline conditions. The cooking step in nixtamalization demands a lot of energy and also generates nejayote, a water pollutant, at the end of the process. The aim of this study was to reduce the cooking time by adding a maize soaking step before nixtamalization while maintaining the quality properties of masa and tortillas. Maize kernels were soaked for 36 h to increase moisture up to 36%. Then, the effect of different cooking times (0, 5, 10, 15, 20, 25, 30, 35, 45 (control) and 50 minutes) was evaluated on the viscosity profile (RVA) of masa in order to select the treatments with a profile similar to that of the control. All treatments were left steeping overnight and had the same milling conditions. The treatments selected were the 20- and 25-min cooking times, which had values for pasting temperature (79.23°C and 80.23°C), maximum viscosity (105.88 cP and 96.25 cP) and final viscosity (188.5 cP and 174 cP) similar to those of the 45-min control (77.65°C, 110.08 cP, and 186.70 cP, respectively). Afterward, tortillas were produced with the chosen treatments (20 and 25 min) and for the control, and were then analyzed for texture, damaged starch, colorimetry, thickness, and average diameter. Colorimetric analysis of tortillas only showed significant differences for the yellow/blue coordinate (b* parameter) at 20 min (0.885), unlike the 25-minute treatment (1.122). Luminosity (L*) and the red/green coordinate (a*) showed no significant differences between treatments with respect to the control (69.912 and 1.072, respectively); however, 25 minutes was closer in both parameters (73.390 and 1.122) than 20 minutes (74.08 and 0.884). For the color difference (ΔE), the 25-min value (3.84) was the most similar to the control. However, for tortilla thickness and diameter, the 20-minute treatment, with 1.57 mm and 13.12 cm respectively, was closer to the control (1.69 mm and 13.86 cm), although smaller. On the other hand, the 25-min treatment tortilla was smaller than both the 20-min and the control, with 1.51 mm thickness and 13.590 cm diameter. According to texture analyses, there was no difference in terms of stretchability (8.803-10.308 gf) and distance to break (95.70-126.46 mm) among all treatments. However, for the breaking point, all treatments (317.1 gf and 276.5 gf for the 25- and 20-min treatments, respectively) were significantly different from the control tortilla (392.2 gf). Results suggest that, by adding a soaking step and reducing the cooking time by 25 minutes, the masa and tortillas obtained had functional and textural properties similar to those of the traditional nixtamalization process.
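The color difference reported above can be computed from CIELAB coordinates with the standard CIE76 formula. The sketch below is illustrative only; the L*, a*, b* triplets are hypothetical placeholders, not values reproduced from the study.

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference between two (L*, a*, b*) triplets."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Hypothetical CIELAB readings for a control tortilla and a treated tortilla
control = (69.9, 1.07, 1.00)
treatment = (73.4, 1.12, 1.12)
print(f"dE = {delta_e_cie76(treatment, control):.2f}")
```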

Keywords: tortilla, nixtamalization, corn, lime cooking, RVA, colorimetry, texture, masa rheology

Procedia PDF Downloads 154
2916 Teaching Practices for Subverting Significant Retentive Learner Errors in Arithmetic

Authors: Michael Lousis

Abstract:

The systematic identification of the most conspicuous and significant errors made by learners during three years of testing of their progress in learning Arithmetic throughout the development of the Kassel Project in England and Greece was accomplished. How retentive these errors were over the three years of officially provided school instruction in Arithmetic in these countries has also been shown. The learners’ errors in Arithmetic stemmed from a sample which comprised two hundred (200) English students and one hundred and fifty (150) Greek students. The sample was purposefully selected according to the students’ participation in each testing session in the development of the three-year project, in both domains, Arithmetic and Algebra, simultaneously. Specific teaching practices have been invented and are presented in this study for subverting these learners’ errors, which were found to be retentive at the level of the nationally provided mathematical education of each country. The invention and development of these proposed teaching practices were founded on the rationality of the theoretical accounts concerning the explanation, prediction and control of the errors, on the conceptual metaphor, and on an analysis that tried to identify the required cognitive components and skills of the specific tasks, in terms of Psychology and Cognitive Science as applied to information-processing. The aim of the implementation of these instructional practices is not only the subversion of these errors but also the achievement of mathematical competence, defined as consisting of three elements: appropriate representations, appropriate meaning, and appropriately developed schemata. However, praxis is of paramount importance, because there is no ‘real truth’ independent of science and because praxis serves as quality control when it takes the form of a cognitive method.

Keywords: arithmetic, cognitive science, cognitive psychology, information-processing paradigm, Kassel project, level of the nationally provided mathematical education, praxis, remedial mathematical teaching practices, retentiveness of errors

Procedia PDF Downloads 299
2915 Sentiment Analysis of Fake Health News Using Naive Bayes Classification Models

Authors: Danielle Shackley, Yetunde Folajimi

Abstract:

As more people turn to the internet seeking health-related information, there is more risk of finding false, inaccurate, or dangerous information. Sentiment analysis is a natural language processing technique that assigns polarity scores to text, ranging from positive through neutral to negative. In this research, we evaluate the weight of a sentiment analysis feature added to fake health news classification models. The dataset consists of existing reliably labeled health article headlines that were supplemented with health information collected about COVID-19 from social media sources. We started with data preprocessing and tested various vectorization methods such as Count and TF-IDF vectorization. We implemented three Naive Bayes classifier models: Bernoulli, Multinomial, and Complement. To test the weight of the sentiment analysis feature on the dataset, we created benchmark Naive Bayes classification models without sentiment analysis, then reproduced those same models with the feature added. We evaluated the models using precision and accuracy scores. The initial Bernoulli model performed with 90% precision and 75.2% accuracy, while the model supplemented with sentiment labels performed with 90.4% precision and stayed constant at 75.2% accuracy. Our results show that the addition of sentiment analysis did not improve model precision by a wide margin; while there was no evidence of improvement in accuracy, the Complement model showed a 1.9% improvement in precision. Future expansion of this work could include replicating the experiment process and substituting a deep learning neural network model for the Naive Bayes classifiers.
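A minimal sketch of the modeling setup described above, assuming scikit-learn; the headlines, labels, and sentiment scores are hypothetical stand-ins for the study's dataset, and a real experiment would evaluate on a held-out test split rather than the training data.

```python
import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import BernoulliNB, MultinomialNB, ComplementNB
from sklearn.metrics import precision_score, accuracy_score

# Hypothetical stand-ins for the labeled headline data set (1 = fake, 0 = reliable)
headlines = ["miracle cure ends covid overnight",
             "who issues updated vaccination guidance",
             "doctors hide this one weird trick",
             "hospital reports drop in flu admissions"]
labels = np.array([1, 0, 1, 0])
sentiment = np.array([0.8, 0.1, 0.6, -0.2])  # polarity scores from any sentiment analyzer

X_text = TfidfVectorizer().fit_transform(headlines)

# Benchmark: text features only; supplemented: the sentiment score appended as one
# extra feature column (shifted to stay non-negative for Multinomial/Complement NB).
X_sent = hstack([X_text, csr_matrix((sentiment + 1.0).reshape(-1, 1))])

for name, X in [("benchmark", X_text), ("with sentiment", X_sent)]:
    for clf in (BernoulliNB(), MultinomialNB(), ComplementNB()):
        pred = clf.fit(X, labels).predict(X)  # real use would hold out a test split
        print(name, clf.__class__.__name__,
              precision_score(labels, pred), accuracy_score(labels, pred))
```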

Keywords: sentiment analysis, Naive Bayes model, natural language processing, topic analysis, fake health news classification model

Procedia PDF Downloads 82
2914 Development of the Integrated Quality Management System of Cooked Sausage Products

Authors: Liubov Lutsyshyn, Yaroslava Zhukova

Abstract:

Over the past twenty years, there has been a drastic change in the mode of nutrition in many countries, which has been reflected in the development of new products and production techniques and has also led to the expansion of sales markets for food products. Studies have shown that solving food safety problems is almost impossible without the active and systematic activity of organizations directly involved in the production, storage and sale of food products, as well as without management of end-to-end traceability and exchange of information. The aim of this research is the development of an integrated quality management and safety assurance system based on the principles of HACCP, traceability and the system approach, with the creation of an algorithm for the identification and monitoring of parameters of the technological process of manufacture of cooked sausage products. A methodology for implementing the integrated system, based on the principles of HACCP, traceability and the system approach, during the manufacture of cooked sausage products has been developed to effectively ensure the defined properties of the finished product. As a result of the research, an evaluation technique and performance criteria for the implementation and operation of the HACCP-based quality management and safety assurance system have been developed and substantiated. The paper reveals regularities in the influence of applying HACCP principles, traceability and the system approach on the quality and safety parameters of the finished product. The study also determined regularities in the identification of critical control points. The algorithm of functioning of the integrated quality management and safety assurance system has been described, and key requirements have been defined for the development of software allowing the prediction of finished product properties, the timely correction of the technological process, and the traceability of manufacturing flows. Based on the results obtained, a typical scheme of the integrated quality management and safety assurance system based on HACCP principles, with elements of end-to-end traceability and the system approach, has been developed for the manufacture of cooked sausage products. Quantitative criteria for evaluating the performance of the quality management and safety assurance system have also been developed, and a set of guidance documents for the implementation and evaluation of the integrated HACCP-based system in meat processing plants has been prepared. The research also demonstrated the effectiveness of continuous monitoring of the manufacturing process at the identified critical control points. The optimal number of critical control points in relation to the manufacture of cooked sausage products has been substantiated. The main results of the research were appraised during 2013-2014 at seven enterprises of the meat processing industry and have been implemented at JSC «Kyiv meat processing plant».

Keywords: cooked sausage products, HACCP, quality management, safety assurance

Procedia PDF Downloads 234
2913 Malachite Green and Red Congo Dyes Adsorption onto Chemically Treated Sewage Sludge

Authors: Zamouche Meriem, Mehcene Ismahan, Temmine Manel, Bencheikh Lehocine Mosaab, Meniai Abdeslam Hassen

Abstract:

In this work, the adsorption of Malachite Green (MG) onto chemically treated sewage sludge has been studied. The sewage sludge, collected from drying beds of the municipal wastewater treatment station of IBN ZIED, Constantine, Algeria, was treated with different acids such as HNO₃, H₂SO₄ and H₃PO₄ to modify its capacity to remove MG from aqueous solutions. The results obtained show that the sewage sludge activated with sulfuric acid gives the highest elimination of MG (9.52 mg/L) compared with the other acids used. The effects of the operating parameters were investigated: the adsorption capacity per unit of adsorbent mass decreases from 18.69 to 1.20 mg/g as the mass of adsorbent increases from 0.25 to 4 g, and the optimum mass giving maximum elimination of the dye is 0.5 g. Increasing the temperature of the solution results in a slight decrease in the adsorption capacity of the chemically treated sludge. The highest amount of dye adsorbed by CSSS (9.56 mg/g) was observed at the optimum temperature of 25°C. The chemically activated sewage sludge also proved effective for the removal of Red Congo (RC), but when comparing the adsorption of the two dyes studied, we noted that the sludge has more affinity for adsorbing MG.
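The adsorption capacity per unit adsorbent mass quoted above is conventionally computed from the batch mass balance q = (C0 − Ce)·V/m. A minimal sketch follows; the numeric inputs are illustrative, not values taken from the study.

```python
def adsorption_capacity(c0_mg_per_l, ce_mg_per_l, volume_l, mass_g):
    """Batch adsorption capacity q (mg of dye per g of adsorbent)."""
    return (c0_mg_per_l - ce_mg_per_l) * volume_l / mass_g

# Illustrative values: 50 mg/L initial dye, 12.5 mg/L residual,
# 0.1 L of solution and 0.5 g of treated sludge
print(adsorption_capacity(50.0, 12.5, 0.1, 0.5))  # -> 7.5 mg/g
```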

Keywords: adsorption, chemical activation, malachite green, sewage sludge

Procedia PDF Downloads 178
2912 Efficient Photocatalytic Degradation of Tetracycline Hydrochloride Using Modified Carbon Nitride CCN/Bi₂WO₆ Heterojunction

Authors: Syed Najeeb-Uz-Zaman Haider, Yang Juan

Abstract:

Antibiotic overuse raises environmental concerns, boosting the demand for efficient removal from pharmaceutical wastewater. Photocatalysis, particularly using semiconductor photocatalysts, offers a promising solution and garners significant scientific interest. In this study, a Z-scheme 0.15BWO/CCN heterojunction was developed, analyzed, and employed for the photocatalytic degradation of tetracycline hydrochloride (TC) under visible light. The study revealed that the dosage of 0.15BWO@CCN and the presence of coexisting ions significantly influenced the degradation efficiency, achieving up to 87% within 20 minutes under optimal conditions (at pH 9-11/strongly basic conditions) while maintaining 84% efficiency under standard conditions (unaltered pH). Photoinduced electrons gathered on the conduction band of BWO while holes accumulated on the valence band of CCN, creating more favorable conditions to produce superoxide and hydroxyl radicals. Additionally, through comprehensive experimental analysis, the degradation pathway and mechanism were thoroughly explored. The superior photocatalytic performance of 0.15BWO@CCN was attributed to its Z-scheme heterojunction structure, which significantly reduced the recombination of photoinduced electrons and holes. The radicals produced were identified using ESR, and their involvement in tetracycline degradation was further analyzed through active species trapping experiments.

Keywords: CCN, Bi₂WO₆, TC, photocatalytic degradation, heterojunction

Procedia PDF Downloads 20
2911 Design and Development of Fleet Management System for Multi-Agent Autonomous Surface Vessel

Authors: Zulkifli Zainal Abidin, Ahmad Shahril Mohd Ghani

Abstract:

Agent-based systems technology has been addressed as a new paradigm for conceptualizing, designing, and implementing software systems. Agents are sophisticated systems that act autonomously across open and distributed environments in solving problems. Nevertheless, it is impractical to rely on a single agent to do all computing processes in solving complex problems. An increasing number of applications lately require multiple agents to work together. A multi-agent system (MAS) is a loosely coupled network of agents that interact to solve problems that are beyond the individual capacities or knowledge of each problem solver. However, the network of a MAS still requires a main system to govern or oversee the operation of the agents in order to achieve a unified goal. We have developed a fleet management system (FMS) to manage the fleet of agents, plan routes for the agents, perform real-time data processing and analysis, and issue sets of general and specific instructions to the agents. This FMS should be able to perform real-time data processing, communicate with the autonomous surface vehicle (ASV) agents and generate a bathymetric map according to the data received from each ASV unit. The first algorithm is developed to communicate with the ASV via radio communication using standard National Marine Electronics Association (NMEA) protocol sentences. The second algorithm takes care of path planning, formation and pattern generation, and is tested using various sample data. Lastly, the bathymetry map generation algorithm makes use of data collected by the agents to create a bathymetry map in real time. The outcome of this research is expected to be applicable to various other multi-agent systems.
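A minimal sketch of the communication step: verifying and parsing a standard NMEA 0183 GGA sentence of the kind the ASV agents would report. The field layout and XOR checksum are part of the NMEA convention; the function names and the sample sentence are illustrative, not the authors' implementation.

```python
from functools import reduce

def nmea_checksum_ok(sentence: str) -> bool:
    """Verify the XOR checksum of an NMEA 0183 sentence like '$GPGGA,...*47'."""
    body, _, checksum = sentence.strip().lstrip("$").partition("*")
    calc = reduce(lambda acc, ch: acc ^ ord(ch), body, 0)
    return f"{calc:02X}" == checksum.upper()

def parse_gga(sentence: str) -> dict:
    """Extract UTC time, latitude and longitude (decimal degrees) from a GGA sentence."""
    fields = sentence.split(",")
    lat = float(fields[2][:2]) + float(fields[2][2:]) / 60.0
    lon = float(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    if fields[5] == "W":
        lon = -lon
    return {"utc": fields[1], "lat": lat, "lon": lon, "n_sats": int(fields[7])}

msg = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
print("checksum ok:", nmea_checksum_ok(msg))
print(parse_gga(msg))
```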

Keywords: autonomous surface vehicle, fleet management system, multi agent system, bathymetry

Procedia PDF Downloads 258
2910 Revolutionizing Healthcare Communication: The Transformative Role of Natural Language Processing and Artificial Intelligence

Authors: Halimat M. Ajose-Adeogun, Zaynab A. Bello

Abstract:

Artificial Intelligence (AI) and Natural Language Processing (NLP) have transformed computer language comprehension, allowing computers to comprehend spoken and written language with human-like cognition. NLP, a multidisciplinary area that combines rule-based linguistics, machine learning, and deep learning, enables computers to analyze and comprehend human language. NLP applications in medicine range from tackling issues in electronic health records (EHR) and psychiatry to improving diagnostic precision in orthopedic surgery and optimizing clinical procedures with novel technologies like chatbots. The technology shows promise in a variety of medical sectors, including quicker access to medical records, faster decision-making for healthcare personnel, diagnosing dysplasia in Barrett's esophagus, boosting radiology report quality, and so on. However, successful adoption requires training for healthcare workers, fostering a deep understanding of NLP components, and highlighting the significance of validation before actual application. Despite prevailing challenges, continuous multidisciplinary research and collaboration are critical for overcoming restrictions and paving the way for the revolutionary integration of NLP into medical practice. This integration has the potential to improve patient care, research outcomes, and administrative efficiency. The research methodology includes using NLP techniques for Sentiment Analysis and Emotion Recognition, such as evaluating text or audio data to determine the sentiment and emotional nuances communicated by users, which is essential for designing a responsive and sympathetic chatbot. Furthermore, the project includes the adoption of a Personalized Intervention strategy, in which chatbots are designed to personalize responses by merging NLP algorithms with specific user profiles, treatment history, and emotional states. The synergy between NLP and personalized medicine principles is critical for tailoring chatbot interactions to each user's demands and conditions, hence increasing the efficacy of mental health care. A detailed survey corroborated this synergy, revealing a remarkable 20% increase in patient satisfaction levels and a 30% reduction in workloads for healthcare practitioners. The poll, which focused on health outcomes and was administered to both patients and healthcare professionals, highlights the improved efficiency and favorable influence on the broader healthcare ecosystem.
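As one possible concrete form of the sentiment-analysis step in the chatbot methodology above, the sketch below uses the Hugging Face transformers sentiment pipeline (its default English model) to route a reply. This is an assumed implementation choice, not the authors' system; a production chatbot would also incorporate the user profile, treatment history, and emotional state described in the abstract.

```python
from transformers import pipeline

# Off-the-shelf sentiment classifier; returns e.g. {'label': 'NEGATIVE', 'score': 0.98}
sentiment = pipeline("sentiment-analysis")

def route_response(user_message: str) -> str:
    result = sentiment(user_message)[0]
    if result["label"] == "NEGATIVE" and result["score"] > 0.9:
        return "I'm sorry you're feeling this way. Would you like to talk to a clinician?"
    return "Thanks for the update. How else can I help with your care today?"

print(route_response("I have been feeling anxious and can't sleep."))
```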

Keywords: natural language processing, artificial intelligence, healthcare communication, electronic health records, patient care

Procedia PDF Downloads 58
2909 Improvement of Water Distillation Plant by Using Statistical Process Control System

Authors: Qasim Kriri, Harsh B. Desai

Abstract:

Water supply and sanitation in Saudi Arabia are characterized by both challenges and achievements. One of the fundamental challenges is water scarcity. In order to overcome water scarcity, significant investments have been made in seawater desalination, water distribution, sewerage, and wastewater treatment. The purpose of Statistical Process Control (SPC) is to determine whether the performance of a process is maintaining an acceptable quality level (AQL). SPC is an analytical decision-making method. A fundamental tool of SPC is the control chart, which tracks the variability in the measurements of the product quality characteristics. By using the appropriate chart, management can determine whether changes need to be made in order to keep the process in control. The two most important quality factors in the distilled water taken into consideration were pH (potential of hydrogen) and TDS (total dissolved solids). Quality checks were carried out at three stages: (1) water at the source, (2) water after chemical treatment, and (3) water sent for packing. The upper specification limit, central limit and lower specification limit are taken as per Saudi water standards. The capability of the process to meet the specifications set for the quality characteristics of the Berain water factory was chosen as the focus of the proposed SPC system.
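A minimal sketch of how Shewhart (X-bar) control limits for a monitored characteristic such as pH could be computed from subgroup samples. The readings and subgroup size are hypothetical; the constant A2 = 0.729 is the standard Shewhart factor for subgroups of four.

```python
import numpy as np

# Hypothetical pH readings: one row per sampling subgroup (n = 4 measurements each)
subgroups = np.array([
    [7.1, 7.0, 7.2, 7.1],
    [7.3, 7.2, 7.1, 7.2],
    [6.9, 7.0, 7.1, 7.0],
    [7.2, 7.1, 7.3, 7.2],
])

xbar = subgroups.mean(axis=1)                          # subgroup means
rng = subgroups.max(axis=1) - subgroups.min(axis=1)    # subgroup ranges

A2 = 0.729                                             # Shewhart constant for n = 4
center = xbar.mean()
ucl = center + A2 * rng.mean()                         # upper control limit
lcl = center - A2 * rng.mean()                         # lower control limit

print(f"CL = {center:.3f}, UCL = {ucl:.3f}, LCL = {lcl:.3f}")
print("subgroups out of control:", np.where((xbar > ucl) | (xbar < lcl))[0])
```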

Keywords: acceptable quality level, statistical quality control, control charts, process charts

Procedia PDF Downloads 172
2908 An Analysis of Learners’ Reports for Measuring Co-Creational Education

Authors: Takatoshi Ishii, Koji Kimita, Keiichi Muramatsu, Yoshiki Shimomura

Abstract:

To increase the quality of learning, teachers and learners need mutual effort for the realization of educational value. For this purpose, we need to manage the co-creational education between teacher and learners. In this research, we try to find features of co-creational education. More precisely, we analyzed learners’ reports by natural language processing and extracted some features that describe the state of the co-creational education.
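A minimal sketch of the report analysis using latent Dirichlet allocation (listed in the keywords) over a bag-of-words representation with scikit-learn; the report texts and number of topics are placeholders, not the study's data or settings.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Placeholder learner reports; real input would be the collected e-portfolio texts
reports = [
    "the group discussion helped me understand the design method",
    "feedback from the teacher changed how I planned my experiment",
    "I revised my report after comparing results with classmates",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(reports)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"topic {k}:", ", ".join(top))
```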

Keywords: co-creational education, e-portfolios, ICT integration, latent dirichlet allocation

Procedia PDF Downloads 603
2907 Carrot: A Possible Source of Multidrug-Resistant Acinetobacter Transmission

Authors: M. Dahiru, O. I. Enabulele

Abstract:

This research investigated the occurrence of multidrug-resistant Acinetobacter in carrots and estimated the role of carrots in its transmission in a rapidly growing urban population. Thus, 50 carrot samples were collected from the Jakara wastewater irrigation farms and analyzed on MacConkey agar, screened by Microbact 24E (Oxoid), and the susceptibility of the isolates was tested against 10 commonly used antibiotics. Acinetobacter baumannii and A. lwoffii were isolated in 22.00% and 16% of samples respectively. Resistance to ceporex and penicillin of 36.36% and 27.27% in A. baumannii, and sensitivity to ofloxacin, pefloxacin, gentamicin and co-trimoxazole, were observed. However, for A. lwoffii, apart from 37.50% resistance to ceporex, it was also resistant to all the other drugs tested. There was a similarity in the resistance shown by A. baumannii and A. lwoffii to the fluoroquinolone and β-lactam drug families, in addition to that between sulfonamide and aminoglycoside demonstrated by A. lwoffii. Interestingly, when resistance similarities to different antibiotics were compared for A. baumannii and A. lwoffii as a whole, significant correlations were observed at P < 0.05 between CPX and NA (46.2%) and between SXT and AU (52.6%) respectively, with high multidrug resistance (MDR) of 27.27% and 62.50% for A. baumannii and A. lwoffii respectively and an overall MDR of 42.11% across all isolates. The occurrence of multidrug-resistant pathogens in carrots is a serious challenge to public health care, especially in a rapidly growing urban population where subsistence agriculture contributes greatly to urban livelihoods and the supply of vegetables.

Keywords: urban agriculture, public health, fluoroquinolone, sulfonamide, multidrug-resistance

Procedia PDF Downloads 345
2906 An EBSD Investigation of Ti-6Al-4Nb Alloy Processed by Plain Strain Compression Test

Authors: Anna Jastrzebska, K. S. Suresh, T. Kitashima, Y. Yamabe-Mitarai, Z. Pakiela

Abstract:

Near-α titanium alloys are important materials for aerospace applications, especially in high-temperature applications such as jet engines. The mechanical properties of Ti alloys strongly depend on their processing route, so it is very important to understand microstructure changes under different processing. In our previous study, Nb was found to improve the oxidation resistance of Ti alloys. In this study, the microstructure evolution of a Ti-6Al-4Nb (wt %) alloy was investigated after plain strain compression tests at hot working temperatures in the α and β phase regions. High-resolution EBSD was successfully used for precise phase and texture characterization of this alloy. A 1.1 kg Ti-6Al-4Nb ingot was prepared using cold crucible levitation melting. The ingot was subsequently homogenized at 1050°C for 1 h followed by cooling in air. Plate-like specimens measuring 10×20×50 mm³ were cut from the ingot by electrical discharge machining (EDM). The plain strain compression test, using an anvil 10 × 35 mm in size, was performed at three different strain rates, 0.1 s⁻¹, 1 s⁻¹ and 10 s⁻¹, at 700°C and 1050°C to obtain 75% deformation. The microstructure was investigated by scanning electron microscopy (SEM) equipped with an electron backscatter diffraction (EBSD) detector. The α/β phase ratio and phase morphology, as well as the crystallographic texture, subgrain size, misorientation angles and misorientation gradients corresponding to each phase, were determined over the middle and edge areas of the samples. The deformation mechanism at each working temperature is discussed, and the evolution of texture with strain rate was investigated. The microstructure obtained by the plain strain compression test was heterogeneous, with a wide range of grain sizes. This is because deformation and dynamic recrystallization occurred during deformation at temperatures in the α and β phase regions, and it was strongly influenced by the strain rate.

Keywords: EBSD, plain strain compression test, Ti alloys

Procedia PDF Downloads 369
2905 Quality Analysis of Vegetables Through Image Processing

Authors: Abdul Khalique Baloch, Ali Okatan

Abstract:

The quality analysis of food and vegetables from images is a hot topic nowadays, with researchers improving on previous findings through different techniques and methods. In this research we reviewed the literature, identified gaps in it, proposed a better approach, designed the algorithm, and developed software to measure quality from images, where the accuracy of the image-based measurement shows better results, and compared the results with previous work done so far. The application uses an open-source dataset and the Python language with the TensorFlow Lite framework. In this research we focus on sorting food and vegetables from images: the application sorts and grades them after processing the images, and it produces fewer errors than manual, human-based grading. Digital picture datasets were created and the collected images were arranged by class. The classification accuracy of the system was about 94%. As fruits and vegetables play a main role in day-to-day life, their quality is essential in evaluating agricultural produce, and customers always want to buy good-quality fruits and vegetables. This document is about quality detection of fruits and vegetables using images. Many customers suffer due to unhealthy fruits and vegetables from suppliers, and there is no proper quality measurement level followed by hotel managements. We have developed software to measure the quality of fruits and vegetables from images; it indicates whether the produce is fresh or rotten. Some of the approaches reviewed in this thesis include digital image processing, ResNet, VGG16, CNN and transfer learning for grading and feature extraction. The application uses an open-source dataset of images, the language used is Python, and a system framework is designed.
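A minimal sketch of the transfer-learning grading approach mentioned above (a frozen VGG16 backbone with a new classification head in Keras); the class count, input size, and dataset loading are placeholders rather than the authors' configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

NUM_CLASSES = 3          # e.g. fresh / medium / rotten -- placeholder grading classes

base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False   # keep the ImageNet features, train only the new head

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()

# Training data would come from e.g.
#   tf.keras.utils.image_dataset_from_directory("fruit_images/", label_mode="categorical")
# followed by model.fit(train_ds, validation_data=val_ds, epochs=10)
```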

Keywords: deep learning, computer vision, image processing, rotten fruit detection, fruits quality criteria, vegetables quality criteria

Procedia PDF Downloads 54
2904 Effects of Fatty Acid Salts and Spices on Dermatophagoides farinae

Authors: Yumeho Obata, Mariko Era, Takayoshi Kawahara, Takahide Kanyama, Hiroshi Morita

Abstract:

Dermatophagoides farinae is a major indoor mite allergen. D. farinae often swarms over powder products (e.g. wheat flour), because it feeds on the starch or protein they contain. Eating powder products contaminated with D. farinae causes various allergic symptoms. Therefore, the creation of food additive agents that are highly safe and control mites is required. Fatty acid salts and spices are known to have pesticidal activities. This study describes the effects of fatty acid salts and spices against Dermatophagoides farinae. Materials and Methods: Potassium salts of 9 fatty acids (C4:0, C6:0, C8:0, C10:0, C12:0, C14:0, C18:1, C18:2, C18:3) were prepared by mixing each fatty acid with the appropriate amount of KOH solution to a concentration of 175 mM and pH 10.5. C12Cu and C12Zn were selected as other fatty acid salts. Cayenne pepper, habanero, Japanese pepper, mustard, jalapeno pepper, curry aroma and cinnamon were selected as spices. D. farinae has been cultured in the laboratory. To rear the mites, double-soled dishes containing sterilized food were put in a large plastic container (30.0 × 20.0 × 20.0 cm) which had 100% ammonium nitrate solution in the bottom. The plastic container was placed in an incubator at 25°C and 64% relative humidity (RH) under dark conditions. The sterilized food was composed of dried bonito flakes and dried yeast (Ebios), 1:1 by weight. For the antiproliferative test, sample and culture medium were mixed in a double-soled dish and kept at 25°C and 64% RH. Decrease rates were determined 1 week and 4 weeks after treatment under a microscope. D. farinae was considered to be dead if its appendages did not move when prodded with a pin. Results and Conclusions: The results show that the fatty acid potassium salts had no antiproliferative effect against D. farinae. On the other hand, Japanese pepper, mustard, curry aroma and cinnamon were effective in decreasing the propagation rate (over 80%) 1 week after treatment against D. farinae. Japanese pepper, curry aroma and cinnamon were effective in decreasing the propagation rate (approximately 100%) 4 weeks after treatment against D. farinae. In particular, Japanese pepper and cinnamon showed the fastest and most persistent antiproliferative effects. These results indicate that Japanese pepper and cinnamon have high antiproliferative effects against D. farinae and suggest that these spices could be used as food additive agents.

Keywords: fatty acid salts, spices, antiproliferative effects, dermatophagoides farinae

Procedia PDF Downloads 215
2903 Deproteinization of Moroccan Sardine (Sardina pilchardus) Scales: A Pilot-Scale Study

Authors: F. Bellali, M. Kharroubi, Y. Rady, N. Bourhim

Abstract:

In Morocco, the fish processing industry is an important source of income and generates a large amount of by-products including skins, bones, heads, guts, and scales. These underutilized resources, particularly scales, contain a large amount of protein and calcium. Sardina pilchardus scales resulting from the processing operation have the potential to be used as raw material for collagen production. Taking into account this strong expectation of the regional fish industry, upgrading of sardine scales is well justified. In addition, political and societal demands for sustainability and environment-friendly industrial production systems, coupled with the depletion of fish resources, drive this trend forward. Therefore, fish scales used as a potential source to isolate collagen have a wide range of applications in the food, cosmetic, and biomedical industries. The main aim of this study is to isolate and characterize acid-solubilized collagen from sardine fish scales, Sardina pilchardus. Experimental design methodology was adopted in the collagen processing for extraction optimization. The first stage of this work is to investigate the optimal conditions of sardine scale deproteinization using response surface methodology (RSM). The second part focuses on demineralization with HCl solution or EDTA. The last one is to establish the optimum conditions for the isolation of collagen from fish scales by solvent extraction. The advancement from lab scale to pilot scale is a critical stage in technological development. In this study, the optimal deproteinization conditions validated at laboratory scale were employed in the pilot-scale procedure. The deproteinization of fish scales was then demonstrated at pilot scale (2 kg scales, 20 L NaOH), resulting in a protein content of 0.2 mg/mL and a hydroxyproline content of 2.11 mg/L. These results indicate that the pilot scale showed performance similar to that of the lab scale.
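A minimal sketch of the response-surface step: fitting a second-order polynomial model of the deproteinization response against process factors and predicting a candidate operating point. The factor names, design points, and response values are illustrative placeholders, not the study's data.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Illustrative design points: NaOH concentration (M), temperature (deg C), time (h)
X = np.array([
    [0.5, 20, 1], [0.5, 40, 3], [1.0, 20, 3], [1.0, 40, 1],
    [0.75, 30, 2], [0.75, 30, 2], [1.0, 40, 3], [0.5, 20, 3],
])
# Illustrative response: residual protein (mg/mL) after deproteinization
y = np.array([0.9, 0.5, 0.6, 0.4, 0.45, 0.47, 0.2, 0.7])

# Second-order (quadratic + interaction) response-surface model
quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), y)

# Predict residual protein for a candidate operating point
candidate = np.array([[0.9, 35, 2.5]])
print(model.predict(quad.transform(candidate)))
```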

Keywords: deproteinization, pilot scale, scale, sardine pilchardus

Procedia PDF Downloads 432
2902 Document-level Sentiment Analysis: An Exploratory Case Study of Low-resource Language Urdu

Authors: Ammarah Irum, Muhammad Ali Tahir

Abstract:

Document-level sentiment analysis in Urdu is a challenging Natural Language Processing (NLP) task due to the difficulty of working with lengthy texts in a language with constrained resources. Deep learning models, which are complex neural network architectures, are well suited to text-based applications in addition to data formats like audio, image, and video. To investigate the potential of deep learning for Urdu sentiment analysis, we implemented five different deep learning models, including Bidirectional Long Short-Term Memory (BiLSTM), Convolutional Neural Network (CNN), Convolutional Neural Network with Bidirectional Long Short-Term Memory (CNN-BiLSTM), and Bidirectional Encoder Representations from Transformers (BERT). In this study, we developed a hybrid deep learning model called BiLSTM-Single Layer Multi Filter Convolutional Neural Network (BiLSTM-SLMFCNN) by fusing the BiLSTM and CNN architectures. The proposed and baseline techniques are applied to the Urdu Customer Support data set and the IMDB Urdu movie review data set using pre-trained Urdu word embeddings that are suitable for sentiment analysis at the document level. The results of these techniques are evaluated, and our proposed model outperforms all other deep learning techniques for Urdu sentiment analysis. BiLSTM-SLMFCNN outperformed the baseline deep learning models and achieved 83%, 79%, 83% and 94% accuracy on the small, medium and large-sized IMDB Urdu movie review data sets and the Urdu Customer Support data set, respectively.
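A minimal Keras sketch of a BiLSTM feeding a single-layer, multi-filter CNN head, in the spirit of the proposed BiLSTM-SLMFCNN; the vocabulary size, sequence length, filter widths, and other hyperparameters are placeholders rather than the authors' settings, and the embedding layer would normally be initialized from the pre-trained Urdu word vectors.

```python
from tensorflow.keras import layers, models

VOCAB, EMB_DIM, MAX_LEN = 30000, 300, 256   # placeholders for tokenizer and embedding sizes

inputs = layers.Input(shape=(MAX_LEN,))
x = layers.Embedding(VOCAB, EMB_DIM)(inputs)            # init from pre-trained Urdu vectors
x = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(x)

# Single convolutional layer with multiple filter widths over the BiLSTM outputs
convs = [layers.GlobalMaxPooling1D()(layers.Conv1D(64, k, activation="relu")(x))
         for k in (3, 4, 5)]
x = layers.Concatenate()(convs)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(1, activation="sigmoid")(x)       # positive vs. negative review

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```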

Keywords: urdu sentiment analysis, deep learning, natural language processing, opinion mining, low-resource language

Procedia PDF Downloads 52
2901 Data Mining Spatial: Unsupervised Classification of Geographic Data

Authors: Chahrazed Zouaoui

Abstract:

In recent years, the volume of geospatial information has been increasing due to the evolution of communication and information technologies; this information is often presented by geographic information systems (GIS) and stored in spatial databases (BDS). Classical data mining has revealed a weakness in knowledge extraction from these enormous amounts of data due to the particularity of spatial entities, which are characterized by interdependence between them (the first law of geography). This gave rise to spatial data mining. Spatial data mining is a process of analyzing geographic data which allows the extraction of knowledge and spatial relationships from geospatial data; among the methods of this process we distinguish the monothematic and the thematic. Geo-clustering is one of the main tasks of spatial data mining and belongs to the monothematic methods. It groups similar geo-spatial entities into the same class and assigns more dissimilar ones to different classes; in other words, it maximizes intra-class similarity and minimizes inter-class similarity, taking into account the particularity of geo-spatial data. Two approaches to geo-clustering exist: dynamic processing of the data, which involves applying algorithms designed for the direct treatment of spatial data, and the approach based on spatial data pre-processing, which consists of applying classic clustering algorithms to pre-processed data (by integration of spatial relationships). This approach (based on pre-treatment) is quite complex in various cases, so the search for approximate solutions involves the use of approximation algorithms, including the algorithms we are interested in: dedicated approaches (partitioning and density-based clustering methods) and bee-inspired algorithms (a biomimetic approach). Our study proposes a design highly relevant to this problem, using different algorithms for automatically detecting geo-spatial neighborhoods in order to implement geo-clustering by pre-treatment, and applying the bees algorithm to this problem for the first time in the geo-spatial field.
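A minimal sketch of one concrete instance of the density-based methods mentioned above: DBSCAN over latitude/longitude points with a haversine distance, so that neighborhood radii are expressed on the Earth's surface. The coordinates and the 5 km radius are placeholders, not the study's data.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Placeholder geographic entities as (latitude, longitude) in degrees
points_deg = np.array([
    [36.75, 3.06], [36.76, 3.05], [36.74, 3.07],   # one dense neighbourhood
    [35.70, -0.63], [35.71, -0.64],                # another
    [31.61, -2.22],                                # isolated point -> spatial noise
])

# DBSCAN with the haversine metric expects radians; eps is an angular distance,
# here roughly 5 km on the Earth's surface.
earth_radius_km = 6371.0
db = DBSCAN(eps=5.0 / earth_radius_km, min_samples=2, metric="haversine",
            algorithm="ball_tree").fit(np.radians(points_deg))

print(db.labels_)   # cluster ids; -1 marks spatial noise
```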

Keywords: mining, GIS, geo-clustering, neighborhood

Procedia PDF Downloads 365
2900 Additive Manufacturing – Application to Next Generation Structured Packing (SpiroPak)

Authors: Biao Sun, Tejas Bhatelia, Vishnu Pareek, Ranjeet Utikar, Moses Tadé

Abstract:

Additive manufacturing (AM), commonly known as 3D printing, with the continuing advances in parallel processing and computational modeling, has created a paradigm shift (with significant radical thinking) in the design and operation of chemical processing plants, especially LNG plants. With rising energy demands, environmental pressures, and economic challenges, there is a continuing industrial need for disruptive technologies such as AM, which possess capabilities that can drastically reduce the cost of manufacturing and operating chemical processing plants in the future. However, the continuing challenge for 3D printing is its lack of adaptability in re-designing process plant equipment, coupled with the absence of theory or models that could assist in selecting the optimal candidates out of the countless potential fabrications that are possible using AM. One of the most common packings used in the LNG process is structured packing in the packed column, a key unit operation in the process. In this work, we present an example of an optimum strategy for the application of AM to this important unit operation. Packed columns use a packing material through which the gas phase passes and comes into contact with the liquid phase flowing over the packing, typically performing the necessary mass transfer to enrich the products. Structured packing consists of stacks of corrugated sheets, typically inclined at 40-70° from the plane. Computational Fluid Dynamics (CFD) was used to test and model various geometries to study the governing hydrodynamic characteristics. The results demonstrate that the costly iterative experimental process can be minimized. Furthermore, they also improve the understanding of the fundamental physics of the system at the multiscale level. SpiroPak, patented by Curtin University, represents an innovative structured packing solution currently at a technology readiness level (TRL) of 5~6. This packing exhibits remarkable characteristics, offering a substantial increase in surface area while significantly enhancing hydrodynamic and mass transfer performance. Recent studies have revealed that SpiroPak can reduce pressure drop by 50~70% compared to commonly used commercial packings, and it can achieve 20~50% greater mass transfer efficiency (particularly in CO2 absorption applications). The implementation of SpiroPak has the potential to reduce the overall size of columns and decrease power consumption, resulting in cost savings for both capital expenditure (CAPEX) and operational expenditure (OPEX) when applied to retrofitting existing systems or incorporated into new processes. Furthermore, pilot- to large-scale tests are currently underway to further advance and refine this technology.

Keywords: Additive Manufacturing (AM), 3D printing, Computational Fluid Dynamics (CFD), structured packing (SpiroPak)

Procedia PDF Downloads 49
2899 Adapting Tools for Text Monitoring and for Scenario Analysis Related to the Field of Social Disasters

Authors: Svetlana Cojocaru, Mircea Petic, Inga Titchiev

Abstract:

Humanity is confronted more and more often with different social disasters, which in turn can generate new accidents and catastrophes. To mitigate their consequences, it is important to obtain early signals about events which are occurring or may occur and to prepare the corresponding scenarios that could be applied. Our research is focused on solving two problems in this domain: identifying signals that an accident has occurred or may occur, and mitigating some consequences of disasters. To solve the first problem, methods of selecting and processing texts from the global Internet are developed. Information in Romanian is of special interest for us. In order to obtain the mentioned tools, we followed several steps, divided into a preparatory stage and a processing stage. Throughout the first stage, we manually collected over 724 news articles and classified them into 10 categories of social disasters, constituting more than 150 thousand words. Using this information, a controlled vocabulary of more than 300 keywords was elaborated that will help in the classification and identification of texts related to the field of social disasters. To solve the second problem, the formalism of Petri nets has been used. We deal with the problem of inhabitants’ evacuation in useful time. Analysis methods such as the reachability or coverability tree and the invariants technique are used to determine dynamic properties of the modeled systems. To perform a case study of the properties of the evacuation system extended by adding time, the analysis modules of PIPE such as Generalized Stochastic Petri Net (GSPN) analysis, simulation, state space analysis, and invariant analysis have been used. These modules helped us obtain the average number of persons situated in the rooms and other quantitative properties and characteristics related to its dynamics.
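A minimal sketch of how a controlled vocabulary can flag and categorize incoming news texts, in the spirit of the first-stage tools described above. The category names and keyword lists are illustrative English placeholders, not the project's actual Romanian vocabulary.

```python
# Illustrative controlled vocabulary: category -> trigger keywords (the real project
# uses a vocabulary of 300+ Romanian keywords across 10 disaster categories)
VOCABULARY = {
    "fire":      {"fire", "smoke", "burned", "flames"},
    "flood":     {"flood", "overflow", "inundated"},
    "explosion": {"explosion", "blast", "detonation"},
}

def classify(text: str) -> dict:
    """Return the disaster categories whose keywords appear in the text."""
    tokens = set(text.lower().split())
    return {cat: sorted(tokens & kws) for cat, kws in VOCABULARY.items() if tokens & kws}

article = "Heavy smoke and flames were reported after the blast near the depot"
print(classify(article))   # {'fire': ['flames', 'smoke'], 'explosion': ['blast']}
```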

Keywords: lexicon of disasters, modelling, Petri nets, text annotation, social disasters

Procedia PDF Downloads 188
2898 Harnessing the Benefits and Mitigating the Challenges of Neurosensitivity for Learners: A Mixed Methods Study

Authors: Kaaryn Cater

Abstract:

People vary in how they perceive, process, and react to internal, external, social, and emotional environmental factors; some are more sensitive than others. Highly sensitive people have a highly reactive nervous system and are more impacted by positive and negative environmental conditions (Differential Susceptibility). Further, some sensitive individuals are disproportionately able to benefit from positive and supportive environments without necessarily suffering negative impacts in less supportive environments (Vantage Sensitivity). Environmental sensitivity is underpinned by physiological, genetic, and personality/temperamental factors, and the phenotypic expression of high sensitivity is Sensory Processing Sensitivity. The hallmarks of Sensory Processing Sensitivity are deep cognitive processing, emotional reactivity, high levels of empathy, noticing environmental subtleties, a tendency to observe new and novel situations, and a propensity to become overwhelmed when over-stimulated. Several educational advantages associated with high sensitivity include creativity, enhanced memory, divergent thinking, giftedness, and metacognitive monitoring. High sensitivity can also lead to some educational challenges, particularly managing multiple conflicting demands and negotiating low sensory thresholds. A mixed methods study was undertaken. In the first quantitative study, participants completed the Perceived Success in Study Survey (PSISS) and the Highly Sensitive Person Scale (HSPS-12). Inclusion criteria were current or previous postsecondary education experience. The survey was presented on social media, and snowball recruitment was employed (n=365). The Excel spreadsheets were uploaded to the Statistical Package for the Social Sciences (SPSS) 26, and descriptive statistics found normal distribution. T-tests and analysis of variance (ANOVA) calculations found no difference in the responses of demographic groups, and Principal Components Analysis and post-hoc Tukey calculations identified positive associations between high sensitivity and three of the five PSISS factors. Further ANOVA calculations found positive associations between the PSISS and two of the three sensitivity subscales. This study included a response field to register interest in further research. Respondents who scored in the 70th percentile on the HSPS-12 were invited to participate in a semi-structured interview. Thirteen interviews were conducted remotely (12 female). Reflexive inductive thematic analysis was employed to analyse the data, and a descriptive approach was employed to present data reflective of participant experience. The results of this study found that highly sensitive students prioritize work-life balance; employ a range of practical metacognitive study and self-care strategies; value independent learning; connect with learning that is meaningful; and are bothered by aspects of the physical learning environment, including lighting, noise, and indoor environmental pollutants. There is a dearth of research investigating sensitivity in the educational context, and these studies highlight the need to promote widespread education sector awareness of environmental sensitivity, and the need to include sensitivity in sector and institutional diversity and inclusion initiatives.

Keywords: differential susceptibility, highly sensitive person, learning, neurosensitivity, sensory processing sensitivity, vantage sensitivity

Procedia PDF Downloads 50
2897 Controlling Drone Flight Missions through Natural Language Processors Using Artificial Intelligence

Authors: Sylvester Akpah, Selasi Vondee

Abstract:

Unmanned Aerial Vehicles (UAVs), as drones are also known, have attracted increasing attention in recent years due to their ubiquitous nature and boundless applications in the areas of communication, surveying, aerial photography, weather forecasting, medical delivery, and surveillance, among others. Operated remotely in real time or pre-programmed, drones can fly autonomously or on pre-defined routes. The application of these aerial vehicles has successfully penetrated the world due to technological evolution, and thus many more businesses are utilizing their capabilities. Unfortunately, while drones offer the benefits stated above, they come with some problems, mainly attributed to the complexity of learning how to master drone flights, collision avoidance, and enterprise security. Additional challenges, such as the analysis of flight data recorded by sensors attached to the drone, may take time and require expert help to analyse and understand. This paper presents an autonomous drone control system using a chatbot. The system allows for easy control of drones using conversations with the aid of Natural Language Processing, thus reducing the workload needed to set up, deploy, control, and monitor drone flight missions. The results obtained at the end of the study revealed that the drone connected to the chatbot was able to initiate flight missions with just text and voice commands, enable conversation, and give real-time feedback from data and requests made to the chatbot. The results further revealed that the system was able to process natural language and produced human-like conversational abilities using Artificial Intelligence (Natural Language Understanding). It is recommended that radio signal adapters be used instead of wireless connections in order to increase the range of communication with the aerial vehicle.
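A minimal sketch of the command-interpretation step: mapping free-text chatbot input to drone actions with simple pattern matching. In the actual system this is handled by the natural-language-understanding service; the command names, patterns, and returned strings below are hypothetical placeholders.

```python
import re

# Placeholder stand-ins for calls into a drone flight API
def takeoff(alt): return f"TAKEOFF altitude={alt}m"
def land():       return "LAND"
def goto(x, y):   return f"GOTO lat={x} lon={y}"

PATTERNS = [
    (re.compile(r"take ?off(?: to (\d+) ?m(?:eters)?)?", re.I), lambda m: takeoff(m.group(1) or 10)),
    (re.compile(r"\bland\b", re.I),                             lambda m: land()),
    (re.compile(r"fly to (-?\d+\.\d+)[, ]+(-?\d+\.\d+)", re.I), lambda m: goto(m.group(1), m.group(2))),
]

def interpret(message: str) -> str:
    """Map a free-text command to a drone action, or report it as not understood."""
    for pattern, action in PATTERNS:
        match = pattern.search(message)
        if match:
            return action(match)
    return "Sorry, I did not understand that flight command."

print(interpret("Please take off to 25 m and start the survey"))
print(interpret("fly to 5.6037, -0.1870"))
```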

Keywords: artificial intelligence, chatbot, natural language processing, unmanned aerial vehicle

Procedia PDF Downloads 129
2896 Processing, Nutritional Assessment and Sensory Evaluation of Bakery Products Prepared from Orange Fleshed Sweet Potatoes (OFSP) and Wheat Composite Flours

Authors: Hategekimana Jean Paul, Irakoze Josiane, Ishimweyizerwe Valentin, Iradukunda Dieudonne, Uwanyirigira Jeannette

Abstract:

Orange-fleshed sweet potatoes (OFSP) are widely grown and plentifully available in rural and urban local markets, and their contribution to the reduction of food insecurity in Rwanda is considerable. However, the postharvest loss of this commodity is a critical challenge due to its high perishability. Several research activities have been conducted on how fresh food commodities can be transformed into extended-shelf-life food products to prevent post-harvest losses, but such work had not yet been well studied in Rwanda. The aim of the present study was the processing of baked products from OFSP combined with wheat composite flour, and the assessment of the nutritional content and consumer acceptability of the newly developed products. The perishability of OFSP and the related scarcity during the off-season can be reduced by producing cake, doughnuts and bread with OFSP puree or flour. For doughnuts and bread, OFSP puree was prepared and mixed with other ingredients to make a dough, followed by frying or baking, while for cake, OFSP was dried in a solar dryer to obtain flour, which was mixed with wheat flour and other ingredients to make a batter for baking. For each product, one control and three experimental samples were prepared (three products in three different ratios (30, 40 and 50%) of OFSP, with the remaining percentage wheat flour). All samples including the control were analyzed for consumer acceptability (sensory attributes). The most preferred samples (one sample for each product with its control sample and for each OFSP variety) were analyzed for nutritional composition along with the control sample. The cake from the Terimbere variety and the bread from Gihingumukungu supplemented with 50% OFSP flour or puree respectively were the most acceptable, except the doughnut from the Vita variety, which was most accepted at 50% OFSP supplementation. The moisture, ash, protein, fat, fiber, total carbohydrate, vitamin C, reducing sugar and mineral (sodium, potassium and phosphorus) contents differed among products. Cake was rich in fiber (14.71%), protein (6.590%) and vitamin C (19.988 mg/100 g) compared to other samples, while bread was found to be rich in reducing sugar with 12.71 mg/100 g compared to cake and doughnut. Doughnut was found to be rich in fat content with 6.89% compared to other samples. For sensory analysis, doughnut was most accepted at the ratio of 60:40 compared to other products, while cake was least accepted at the ratio of 50:50. The proximate composition and mineral content of all the OFSP products were significantly higher compared to the control samples.

Keywords: post-harvest loss, OFSP products, wheat flour, sensory evaluation, proximate composition

Procedia PDF Downloads 45
2895 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation

Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk

Abstract:

The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition and security are among the possible fields of utilization. In all these fields, the amount of collected data is increasing quickly, but with the increase of the data, the computation speed becomes the critical factor. Data reduction is one of the solutions to this problem. Removing the redundancy in rough sets can be achieved with the reduct. A lot of algorithms for generating the reduct were developed, but most of them are only software implementations and therefore have many limitations. A microprocessor uses a fixed word length and consumes a lot of time for both fetching and processing of instructions and data; consequently, software-based implementations are relatively slow. Hardware systems do not have these limitations and can process the data faster than software. A reduct is a subset of the condition attributes that preserves the discernibility of the objects. For a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes; none of its elements can be removed without affecting the classification power of all condition attributes. Moreover, every reduct contains all the attributes from the core. In this paper, the hardware implementation of a two-stage greedy algorithm to find one reduct is presented. The decision table is used as the input. The output of the algorithm is the superreduct, which is a reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are more frequent in the decision table. The algorithm described above has two disadvantages: i) it generates the superreduct instead of the reduct, and ii) the additional first stage may be unnecessary if the core is empty. But for systems focused on fast computation of the reduct, the first disadvantage is not the key problem. The core calculation can be achieved with a combinational logic block and thus adds relatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by comparators connected to a block called a 'singleton detector', which detects whether the input word contains only a single 'one'. Calculating the number of occurrences of an attribute is performed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit for controlling the calculations. For research purposes, the algorithm was also implemented in the C language and run on a PC, and the execution times of the reduct calculation in hardware and software were compared. The results show an increase in the speed of data processing.
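A minimal software sketch of the two-stage procedure the hardware implements: the core is taken from singleton entries of the discernibility matrix (the role of the 'singleton detector'), then attributes are added greedily until every object pair with different decisions is discerned. The toy decision table is ours, and here the greedy step counts attribute frequency over the still-uncovered discernibility entries, which is one common variant of the frequency criterion described above.

```python
from itertools import combinations
from collections import Counter

# Toy decision table: each row is (condition attribute values, decision)
rows = [((1, 0, 1), "yes"), ((1, 1, 0), "no"), ((0, 0, 1), "yes"), ((0, 1, 1), "no")]
n_attrs = 3

# Discernibility matrix entries for object pairs with different decisions
def discerning_attrs(a, b):
    return {i for i in range(n_attrs) if a[0][i] != b[0][i]}

entries = [discerning_attrs(a, b) for a, b in combinations(rows, 2) if a[1] != b[1]]

# Stage 1: core = attributes that appear as singleton discernibility entries
core = {next(iter(e)) for e in entries if len(e) == 1}

# Stage 2: greedily add the most frequent attribute until every entry is covered
reduct = set(core)
while any(not (e & reduct) for e in entries):
    freq = Counter(i for e in entries if not (e & reduct) for i in e)
    reduct.add(freq.most_common(1)[0][0])

print("core:", core, "superreduct:", reduct)
```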

Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set

Procedia PDF Downloads 205
2894 Signal Processing of the Blood Pressure and Characterization

Authors: Hadj Abd El Kader Benghenia, Fethi Bereksi Reguig

Abstract:

In clinical medicine, blood pressure and hemodynamic monitoring provide rich pathophysiological information about the cardiovascular system, described through factors such as blood volume, arterial compliance and peripheral resistance. In this work, we are interested in analyzing these signals in order to propose a detection algorithm that delineates the different sequences, especially systolic blood pressure (SBP), diastolic blood pressure (DBP) and the dicrotic wave, and to analyze them in order to extract the cardiovascular parameters.
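A minimal sketch of the delineation step: locating systolic peaks and diastolic troughs in a sampled arterial pressure waveform with a standard peak detector. The synthetic waveform, sampling rate, and detector thresholds are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 125                                   # sampling frequency (Hz), typical for ABP monitors
t = np.arange(0, 10, 1 / fs)
# Synthetic arterial pressure: ~1.2 Hz beats around 90 mmHg with a superimposed harmonic
abp = 90 + 20 * np.sin(2 * np.pi * 1.2 * t) + 8 * np.sin(2 * np.pi * 2.4 * t)

sys_idx, _ = find_peaks(abp, distance=0.4 * fs, prominence=10)    # systolic peaks
dia_idx, _ = find_peaks(-abp, distance=0.4 * fs, prominence=10)   # diastolic minima

sbp = abp[sys_idx].mean()
dbp = abp[dia_idx].mean()
print(f"SBP ~ {sbp:.1f} mmHg, DBP ~ {dbp:.1f} mmHg, HR ~ {60 * len(sys_idx) / 10:.0f} bpm")
```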

Keywords: blood pressure, SBP, DBP, detection algorithm

Procedia PDF Downloads 420
2893 Effect of Fluidized Granular Activated Carbon for the Mitigation of Membrane Fouling in Wastewater Treatment

Authors: Jingwei Wang, Anthony G. Fane, Jia Wei Chew

Abstract:

The use of fluidized granular activated carbon (GAC) as a means of mitigating membrane fouling in membrane bioreactors (MBRs) has received much attention in recent years, especially in anaerobic fluidized bed membrane bioreactors (AFMBRs). It has been affirmed that the unsteady-state tangential shear conferred by GAC fluidization on the membrane surface suppresses the extent of membrane fouling, with energy consumption much lower than that of bubbling (i.e., air sparging). In a previous work, the hydrodynamics of the fluidized GAC particles were correlated with membrane fouling mitigation effectiveness. The results verified that the momentum transfer from particles to membrane holds a key to fouling mitigation. The goal of the current work is to understand the effect of fluidized GAC on the membrane critical flux. Membrane critical flux values were measured with a vertical Direct Observation Through the Membrane (DOTM) setup. Polystyrene particles (known as latex particles) with a particle size of 5 µm were used as the model foulant, giving the number of foulant particles on the membrane surface. Our results shed light on the positive effect of fluidized GAC, which enhances the critical membrane flux by an order of magnitude compared to that of liquid shear alone. Membrane fouling mitigation benefited from increasing the power input.

Keywords: membrane fouling mitigation, liquid-solid fluidization, critical flux, energy input

Procedia PDF Downloads 390
2892 Survey of Communication Technologies for IoT Deployments in Developing Regions

Authors: Namugenyi Ephrance Eunice, Julianne Sansa Otim, Marco Zennaro, Stephen D. Wolthusen

Abstract:

The Internet of Things (IoT) is a network of connected data-processing devices, mechanical and digital machinery, items, animals, or people that can send data across a network without requiring human-to-human or human-to-computer interaction. Each component has sensors that can pick up specific phenomena, as well as processing software and other technologies that link to and communicate with other systems and/or devices over the Internet or other communication networks and exchange data with them. IoT is increasingly being used in fields other than consumer electronics, such as public safety, emergency response, industrial automation, autonomous vehicles, the Internet of Medical Things (IoMT), and general environmental monitoring. Consumer-based IoT applications, like smart home gadgets and wearables, are also becoming more prevalent. This paper presents the main IoT deployment areas for environmental monitoring in developing regions and the backhaul options suitable for them. A detailed review of each of the papers selected for the study is included in Section III of this document. The study includes an overview of existing IoT deployments and the underlying communication architectures, protocols, and technologies that support them. This overview shows that Low Power Wide Area Networks (LPWANs), as summarized in Table 1, are very well suited for environmental monitoring architectures designed for remote locations. LoRa technology, particularly the LoRaWAN protocol, has an advantage over other technologies due to its low power consumption, adaptability, and suitable communication range. The prevailing challenges of the different architectures are discussed and summarized in Table 3 of Section IV, where the main problem is the obstruction of communication paths by buildings, trees, hills, etc.

Keywords: communication technologies, environmental monitoring, Internet of Things, IoT deployment challenges

Procedia PDF Downloads 68
2891 Effect of Processing Parameters on the Physical Properties of Pineapple Pomace Based Aquafeed

Authors: Oluwafemi Babatunde Oduntan, Isaac A. Bamgboye

Abstract:

The disposal and management of solid waste from pineapple juice processing constitute an environmental contamination problem affecting public health. The use of this by-product, called pomace, has the potential to reduce the cost of aquafeed. Pineapple pomace collected after juice extraction was dried and milled. The interactive effects of feeding rate (1.28, 1.44 and 1.60 kg/min), screw speed (305, 355 and 405 rpm), moisture content (16, 19 and 22%), temperature (60, 80, 100 and 120°C), cutting speed (1300, 1400 and 1500 rpm), pomace inclusion ratio (5, 10, 15 and 20%) and open surface die (50, 75 and 100%) on the extrudate physical properties (bulk density, unit density, expansion ratio, durability and floatability) were investigated using an optimal custom design (OCD) matrix and response surface methodology. The predicted values were found to be in good agreement with the experimental values for expansion ratio, durability and floatability (R² = 0.7970, 0.9264 and 0.9098, respectively), with the exceptions of unit density and bulk density (R² = 0.1639 and 0.2768, respectively). All the extrudates showed relatively high floatability and durability. The inclusion of pineapple pomace produced less expanded and more compact textured extrudates. Results indicated that increasing the pineapple pomace inclusion, screw speed and feeding rate decreased the unit density, bulk density, expansion ratio, durability and floatability of the extrudate. However, increasing the moisture content of the feed mash increased the unit density and bulk density, while increasing the extrusion temperature and cutting speed increased the floatability and durability of the extrudate. The proportion of pineapple pomace in the extruded aquafeed was observed to have a significantly lower effect on the selected responses.
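
As a minimal sketch of how a response surface model of this kind can be fitted and its R² computed (this is not the study's actual model), the example below fits a second-order polynomial in two coded factors by least squares; the factor names and data values are invented for illustration.

```python
import numpy as np

# Hypothetical data: two coded factors (e.g., screw speed and pomace ratio)
# and one response (e.g., expansion ratio). Values are illustrative only.
x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1, 1], dtype=float)
x2 = np.array([-1, 1, -1, 1, 0, -1, 1, 0, 0], dtype=float)
y = np.array([1.8, 1.6, 1.7, 1.4, 2.0, 1.9, 1.7, 1.9, 1.8])

# Second-order response surface model:
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Coefficient of determination R^2 for the fitted surface.
y_hat = X @ coef
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print("coefficients:", np.round(coef, 3))
print("R^2 =", round(r2, 4))
```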

Keywords: aquafeed, extrusion, physical properties, pineapple pomace, waste

Procedia PDF Downloads 255
2890 Evaluation of Lead II Adsorption in Porous Structures Manufactured from Chitosan, Hydroxyapatite and Moringa

Authors: Mishell Vaca, Gema Gonzales, Francisco Quiroz

Abstract:

Heavy metals present in wastewater constitute a danger for living beings in general. In Ecuador, one of the sources of contamination is artisanal mining, whose liquid effluents, in many cases without prior treatment, are discharged into the surrounding rivers. Lead is a pollutant that accumulates in the body and causes severe health effects. Nowadays, there are several treatment methods to reduce this pollutant. The aim of this study is to reduce the concentration of lead II using a porous material formed by a chitosan matrix in which hydroxyapatite and moringa particles smaller than 53 µm are suspended. These materials are not toxic to the environment, and each one adsorbs metals independently, so the synergistic effect between them will be evaluated. The synthesized material has a cylindrical design that increases the surface area, which is expected to provide a greater adsorption capacity. It has been determined that the best conditions for its preparation are to dissolve the chitosan in 1% v/v acetic acid at pH = 5 and then add the hydroxyapatite and moringa to the mixture under magnetic stirring. This suspension is frozen, lyophilized and finally dried. In order to evaluate the performance of the synthesized material, synthetic lead solutions are prepared at different concentrations, and the percentage of removal is evaluated. The expected outcome is an effluent whose lead content is less than 0.2 mg/L, which is the maximum allowable limit according to established environmental standards.
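
As a minimal sketch of how removal performance is typically quantified in such batch tests (the study's own data are not reported here), the example below computes the standard removal percentage and adsorption capacity from initial and equilibrium concentrations; the concentrations, volume, and adsorbent mass are hypothetical.

```python
def removal_percentage(c0_mg_l, ce_mg_l):
    """Removal efficiency (%) from initial and equilibrium concentrations."""
    return 100.0 * (c0_mg_l - ce_mg_l) / c0_mg_l

def adsorption_capacity(c0_mg_l, ce_mg_l, volume_l, adsorbent_mass_g):
    """Amount of lead adsorbed per gram of material, q_e in mg/g."""
    return (c0_mg_l - ce_mg_l) * volume_l / adsorbent_mass_g

# Hypothetical batch test: 50 mL of a 10 mg/L Pb(II) solution treated with
# 0.1 g of the chitosan/hydroxyapatite/moringa material, leaving 0.15 mg/L.
c0, ce, v, m = 10.0, 0.15, 0.050, 0.1
print(f"removal = {removal_percentage(c0, ce):.1f} %")
print(f"q_e = {adsorption_capacity(c0, ce, v, m):.2f} mg/g")
print("meets 0.2 mg/L limit:", ce < 0.2)
```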

Keywords: adsorption, chitosan, hydroxyapatite, lead, moringa, water treatment

Procedia PDF Downloads 148
2889 Produce Large Surface Area Activated Carbon from Biomass for Water Treatment

Authors: Rashad Al-Gaashani

Abstract:

A physicochemical activation method was used to produce high-quality activated carbon (AC) with a large surface area of about 2000 m²/g from a low-cost and abundant biomass waste in Qatar, namely date seeds. X-ray diffraction (XRD), scanning electron microscopy (SEM), energy dispersive X-ray spectroscopy (EDS), and Brunauer-Emmett-Teller (BET) surface area analysis were used to evaluate the AC samples. The AC produced from date seeds has a wide range of pores available, including micro- and nano-pores. This type of AC, with a well-developed pore structure, may be very attractive for different applications, including air and water purification from micro- and nano-pollutants. Iron (III) and copper (II) heavy metal ions were removed from wastewater with the produced AC using a batch adsorption technique. The AC produced from date seed biomass waste shows high removal of heavy metals such as iron (III) ions (100%) and copper (II) ions (97.25%). The highest removal of copper (II) ions (100%) with the AC produced from date seeds was found at pH 8, whereas the lowest removal (22.63%) occurred at pH 2. The effects of adsorption time, adsorbent dose, and pH on the removal of heavy metals were studied.

Keywords: activated carbon, date seeds, biomass, heavy metals removal, water treatment

Procedia PDF Downloads 62
2888 Valonea Tannin Supported AgCl/ZnO/Fe3O4 Nanocomposite, a Magnetically Separable Photocatalyst with Enhanced Photocatalytic Performance under Visible Light Irradiation

Authors: Nuray Güy, Mahmut Özacar

Abstract:

In the past few decades, considerable attention has been devoted to photocatalysts for the photocatalytic degradation of environmental pollutants. Many novel nanostructured photocatalysts for wastewater treatment have been investigated, such as TiO2, CdS, ZnO, and silver halides (AgX, X = Cl, Br, I). Silver halides are photosensitive materials which can absorb photons in the visible region to produce electron–hole pairs; however, they are expensive, which restricts their application in large-scale photocatalytic processes. Because tannin contains hydroxyl functional groups, it was employed as a modifier to improve the surface properties and adsorption capacity of the activated carbon towards metal cation uptake. In this work, we designed a new structure of magnetically separable photocatalyst that combines AgCl/ZnO nanoparticles with Fe3O4 nanoparticles deposited on tannin, denoted as (AgI/ZnO)-Fe3O4/Tannin. The as-prepared products were characterized by X-ray diffraction (XRD), field emission scanning electron microscopy (FESEM), Fourier transform infrared spectroscopy (FTIR), diffuse reflectance spectroscopy (DRS) and vibrating sample magnetometry (VSM). The photocatalyst exhibited high activity in degrading a textile dye under visible light irradiation. Moreover, its excellent magnetic properties provide a convenient way to recycle the photocatalyst.

Keywords: AgI/ZnO-Fe3O4/Tannin, visible light, magnetically separable, photocatalyst

Procedia PDF Downloads 202