Search results for: forest products pricing
3311 Identification and Evaluation of Landscape Mosaics of Kutlubeyyazıcılar Campus, Bartın University, Turkey
Authors: Y. Sarı Nayim, B. N. Nayim
Abstract:
This research covers the definition and evaluation of the semi-natural and cultural ecosystems at the Bartın University main campus in Turkey in terms of landscape mosaics. The ecosystem mosaic of the main campus was divided into zones based on an ecological classification technique. Based on the results of the study, it was found that 6 different ecosystem mosaics should be used as a base in the planning and design of the existing and future landscape of Kutlubeyyazıcılar campus. The first landscape zone involves the 'social areas'. These areas include yards, dining areas, recreational areas and lawn areas. The second landscape zone is 'main vehicle and pedestrian areas'. These areas include vehicle access to the campus landscape, vehicle circulation within the campus, parking and pedestrian walkways. The third zone is 'landscape areas with high visual landscape quality'. These areas will be the places where attractive structural and plant landscape elements will be used. The fourth zone will be 'landscapes of building borders and their surroundings'. The fifth zone, which is important and should be preserved in the future, is 'actual semi-natural forest and bush areas'. The last zone is the 'water landscape', which brings ecological value to landscape areas. While determining the most convenient areas in the planning and design of the campus, these landscape mosaics should be taken into consideration. This zoning will ensure that the campus landscape is protected and that living spaces on the campus, apart from the areas where human activities are carried out, will be used properly.
Keywords: campus landscape planning and design, landscape ecology, landscape mosaics, Bartın
Procedia PDF Downloads 366
3310 Phytoremediation of Pharmaceutical Emerging Contaminant-Laden Wastewater: A Techno-Economic and Sustainable Development Approach
Authors: Reda A. Elkhyat, Mahmoud Nasr, Amel A. Tammam, Mohamed A. Ghazy
Abstract:
Pharmaceuticals and personal care products (PPCPs) are a unique group of emerging contaminants continuously introduced into the aquatic ecosystem at concentrations capable of inducing adverse effects on humans and aquatic organisms, even at trace levels ranging from ppt to ppm. Amongst the common pharmaceutical emerging pollutants detected in several aquatic environments, acetaminophen has been recognized for its high toxicity. Once released into the aquatic environment, acetaminophen can be degraded by the microbial community and removed by adsorption/uptake by plants. Although many studies have investigated the hazard risks of acetaminophen pollutants to aquatic animals, the number of studies demonstrating its removal efficiency and effects on aquatic plants still needs to be expanded. In this context, this study aims to apply an aquatic plant-based phytoremediation system to eliminate this emerging contaminant from domestic wastewater. The phytoremediation experiment was performed in a hydroponic system containing Eichhornia crassipes and operated under the natural environment at 25°C to 30°C. This system was subjected to synthetic domestic wastewater with a maximum initial chemical oxygen demand (COD) of 390 mg/L and three different acetaminophen concentrations of 25, 50, and 200 mg/L. After 17 d of operation, the phytoremediation system achieved removal efficiencies of about 100% and 85.6±4.2% for acetaminophen and COD, respectively. Moreover, the Eichhornia crassipes could withstand the toxicity associated with increasing the acetaminophen concentration from 25 to 200 mg/L. This high treatment performance could be assigned to the good adaptation of the water hyacinth to the phytoremediation factors. Moreover, it has been proposed that this phytoremediation system could be largely supported by phytodegradation and plant uptake mechanisms; however, detection of the generated intermediates, metabolites, and degradation products is still under investigation. Applying this free-floating plant in wastewater treatment and reducing emerging contaminants would meet the targets of SDGs 3, 6, and 14. A cost-benefit analysis was performed for the phytoremediation system. The phytoremediation system is financially viable, as the net profit was 2921 US$/y with a payback period of nine years.
Keywords: domestic wastewater, emerging pollutants, hydrophyte Eichhornia crassipes, paracetamol removal efficiency, sustainable development goals (SDGs)
Procedia PDF Downloads 115
3309 Study of Rehydration Process of Dried Squash (Cucurbita pepo) at Different Temperatures and Dry Matter-Water Ratios
Authors: Sima Cheraghi Dehdezi, Nasser Hamdami
Abstract:
Air-drying is the most widely employed method for preserving fruits and vegetables. Most dried products must be rehydrated by immersion in water prior to their use, so the study of rehydration kinetics in order to optimize the rehydration phenomenon has great importance. Rehydration typically comprises three simultaneous processes: the imbibition of water into the dried material, the swelling of the rehydrated products and the leaching of soluble solids into the rehydration medium. In this research, squash (Cucurbita pepo) fruits were cut into 0.4 cm thick and 4 cm diameter slices. Then, squash slices were blanched in a steam chamber for 4 min. After cooling to room temperature, squash slices were dehydrated in a hot air dryer, under an air flow of 1.5 m/s and an air temperature of 60°C, down to a moisture content of 0.1065 kg H2O per kg d.m. Dehydrated samples were kept in polyethylene bags and stored at 4°C. Squash slices of specified weight were rehydrated by immersion in distilled water at different temperatures (25, 50, and 75°C) and various dry matter-water ratios (1:25, 1:50, and 1:100), agitated at 100 rpm. At specified time intervals, up to 300 min, the squash samples were removed from the water, and the weight, moisture content and rehydration indices of the samples were determined. The texture characteristics were examined over a 180 min period. The results showed that rehydration time and temperature had significant effects on moisture content, water absorption capacity (WAC), dry matter holding capacity (DHC), rehydration ability (RA), maximum force and stress in dried squash slices. Dry matter-water ratio had a significant effect (p<0.01) on all squash slice properties except DHC. Moisture content, WAC and RA of squash slices increased, whereas DHC and texture firmness (maximum force and stress) decreased with rehydration time. The maximum moisture content, WAC and RA and the minimum DHC, force and stress were observed in squash slices rehydrated in 75°C water. The lowest moisture content, WAC and RA and the highest DHC, force and stress were observed in squash slices immersed in water at a 1:100 dry matter-water ratio. In general, for all rehydration conditions of squash slices, the highest water absorption rate occurred during the first minutes of the process. Then, this rate decreased. The highest rehydration rate and amount of water absorption occurred at 75°C.
Keywords: dry matter-water ratio, squash, maximum force, rehydration ability
Procedia PDF Downloads 313
3308 The Methods of Customer Satisfaction Measurement and Its Statistical Analysis towards Sales and Logistic Activities in Food Sector
Authors: Seher Arslankaya, Bahar Uludağ
Abstract:
Meeting the needs and demands of customers and pleasing the customers are important requirements for companies in the food sector, where the growth of competition is significantly unpredictable. Customer satisfaction is also one of the key concepts, mainly driven by a wide range of customer preferences and expectations for the products and services introduced and delivered to them. In order to meet customer demands, companies that operate in the food sector are expected to have a well-managed Total Quality Management (TQM) system, which sets out to improve the quality of products and services, to reduce costs and to increase customer satisfaction by restructuring traditional management practices. It aims to increase customer satisfaction by meeting customer expectations and requirements. The achievement would be determined with the help of customer satisfaction surveys, which are done to obtain immediate feedback and to provide quick responses. In addition, the surveys would also assist the making of strategic plans which help to anticipate customers' future needs and expectations. Meanwhile, periodic measurement of customer satisfaction is a must because, with a better understanding of customers' perceptions from the surveys (done by questionnaires), companies would have a clear idea of their own strengths and weaknesses, which helps them keep their loyal customers, stand in comparison with their competitors and map out their future progress and improvement. In this study, we propose a survey-based customer satisfaction measurement method and its statistical analysis for the sales and logistics activities of food firms. Customer satisfaction is discussed in detail. Furthermore, after analysing the data derived from the questionnaire applied to customers by using the SPSS software, various results obtained from the application are presented. By also applying an ANOVA test, the study analyses the existence of meaningful differences between customer demographic groups and their perceptions. The purpose of this study is also to find out requirements which help to remove the effects that decrease customer satisfaction and to produce loyal customers in the food industry. For this purpose, customer complaints are collected. Additionally, comments and suggestions are made according to the obtained survey results, which would be useful for the strategic planning process in the food industry.
Keywords: customer satisfaction measurement and analysis, food industry, SPSS, TQM
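As a rough illustration of the ANOVA comparison described in this abstract, the sketch below (Python, scipy) tests whether mean satisfaction scores differ across demographic groups; the column names and values are invented placeholders rather than the study's SPSS data.

```python
import pandas as pd
from scipy.stats import f_oneway

# Hypothetical survey responses: demographic group and a 1-5 satisfaction score
df = pd.DataFrame({
    "age_group":    ["18-30", "18-30", "31-45", "31-45", "46-60", "46-60"],
    "satisfaction": [4, 5, 3, 4, 2, 3],
})

# One-way ANOVA: does mean satisfaction differ between demographic groups?
groups = [g["satisfaction"].values for _, g in df.groupby("age_group")]
f_stat, p_value = f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # p < 0.05 suggests a meaningful difference
```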
Procedia PDF Downloads 250
3307 The Importance of Intellectual Property for Universities of Technology in South Africa: Challenges Faced and Proposed Way Forward
Authors: Martha E. Ikome, John M. Ikome
Abstract:
Intellectual property should be a day-to-day business decision due to its value, but increasingly, a number of institutions are still not aware of its importance. Intellectual Property (IP) and its value are often not adequately appreciated. In the increasingly knowledge-driven economy, IP is a key consideration in day-to-day business decisions because new ideas and products appear almost daily in the market, which results in continuous innovation and research. Therefore, this paper focuses on the importance of IP for universities of technology and further demonstrates how IP can become an economic tool, as well as the challenges faced by these universities in implementing an IP system.
Keywords: intellectual property, institutions, challenges, protection
Procedia PDF Downloads 375
3306 Life Cycle Assessment of Biogas Energy Production from a Small-Scale Wastewater Treatment Plant in Central Mexico
Authors: Joel Bonales, Venecia Solorzano, Carlos Garcia
Abstract:
A great percentage of the wastewater generated in developing countries does not receive any treatment, which leads to numerous environmental impacts. In response to this, a paradigm change in the current wastewater treatment model, from one based on large-scale plants towards a model based on small and medium scales, has been proposed. Nevertheless, small-scale wastewater treatment plants (SS-WWTP) with novel technologies such as anaerobic digesters, as well as the utilization of derivative co-products such as biogas, still present diverse environmental impacts which must be assessed. This study consisted of a Life Cycle Assessment (LCA) performed on an SS-WWTP which treats wastewater from a small commercial block in the city of Morelia, Mexico. The treatment performed in the SS-WWTP consists of anaerobic and aerobic digesters with a daily capacity of 5,040 L. Two different scenarios were analyzed: the current plant conditions and a hypothetical energy use of the biogas obtained in situ. Furthermore, two different allocation criteria were applied: full impact allocation to the system's main product (treated water) and substitution credits for replacing Mexican grid electricity (biogas) and clean water pumping (treated water). The results showed that the analyzed plant had larger impacts than those reported in the literature on the basis of wastewater volume treated, which may imply that this plant is currently operating inefficiently. The evaluated impacts appeared to be concentrated in the aerobic digestion and electricity generation phases due to the plant's particular configuration. Additional findings show that the allocation criterion applied is crucial for the interpretation of impacts and that the energy use of the biogas obtained in this plant can help mitigate associated climate change impacts. It is concluded that SS-WWTP is an environmentally sound alternative for wastewater treatment from a systemic perspective. However, this type of study must be careful in the selection of the allocation criteria and replaced products, since these factors have a great influence on the results of the assessment.
Keywords: biogas, life cycle assessment, small scale treatment, wastewater treatment
Procedia PDF Downloads 124
3305 Machine Learning Based Approach for Measuring Promotion Effectiveness in Multiple Parallel Promotions’ Scenarios
Authors: Revoti Prasad Bora, Nikita Katyal
Abstract:
Promotion is a key element in the retail business. Thus, analysis of promotions to quantify their effectiveness in terms of Revenue and/or Margin is an essential activity in the retail industry. However, measuring the sales/revenue uplift is based on estimations, as the actual sales/revenue without the promotion is not observable. Further, the presence of Halo and Cannibalization in a multiple parallel promotions scenario complicates the problem. Calculating the Baseline by considering inter-brand/competitor items, or estimating Halo and Cannibalization's impact on Revenue by treating the Baseline as an interpretation of items' unit sales in neighboring non-promotional weeks individually, may not capture the overall Revenue uplift in the case of multiple parallel promotions. Hence, this paper proposes a Machine Learning based method for calculating the Revenue uplift by considering the Halo and Cannibalization impact on the Baseline and the Revenue. In the first section of the proposed methodology, the Baseline of an item is calculated by incorporating the impact of the promotions on its related items. In the later section, the Revenue of an item is calculated by considering both Halo and Cannibalization impacts. Hence, this methodology enables correct calculation of the overall Revenue uplift due to a given promotion.
Keywords: Halo, Cannibalization, promotion, Baseline, temporary price reduction, retail, elasticity, cross price elasticity, machine learning, random forest, linear regression
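As a hedged sketch of the kind of baseline-and-uplift calculation outlined above, the following Python example trains a regressor on non-promotional weeks, using related items' promotion flags to stand in for Halo/Cannibalization effects, and then measures uplift in promotional weeks; all data and column names are invented, and the paper's actual feature set and models may differ.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Hypothetical weekly data for one item: its own promo flag, promo flags of related
# items (to capture Halo/Cannibalization), seasonality, and observed revenue.
weeks = pd.DataFrame({
    "own_promo":     [0, 0, 1, 0, 1, 0, 0, 1],
    "related_promo": [0, 1, 0, 0, 1, 1, 0, 0],
    "week_of_year":  [1, 2, 3, 4, 5, 6, 7, 8],
    "revenue":       [100, 90, 160, 105, 150, 95, 102, 170],
})

features = ["related_promo", "week_of_year"]
train = weeks[weeks["own_promo"] == 0]          # non-promotional weeks only

# Baseline model: expected revenue had there been no own promotion
baseline_model = RandomForestRegressor(n_estimators=200, random_state=0)
baseline_model.fit(train[features], train["revenue"])

promo_weeks = weeks[weeks["own_promo"] == 1]
baseline = baseline_model.predict(promo_weeks[features])
uplift = promo_weeks["revenue"].values - baseline   # uplift attributable to the promotion
print(uplift.round(1))
```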
Procedia PDF Downloads 178
3304 Analysis of the Production Time in a Pharmaceutical Company
Authors: Hanen Khanchel, Karim Ben Kahla
Abstract:
Pharmaceutical companies are facing competition. Indeed, the price differences between competing products can be such that it becomes difficult to compensate for them by differences in value added. The conditions of competition are no longer homogeneous for the players involved. The price of a product is a given that puts a company and its customer face to face. However, price fixing obliges the company to consider internal factors relating to production costs and external factors such as customer attitudes, the existence of regulations and the structure of the market in which the firm operates. In setting the selling price, the company must first take into account internal factors relating to its costs: production costs fall into two categories, fixed costs and variable costs that depend on the quantities produced. The company cannot consider selling below what the product costs it. It therefore calculates the unit cost of production, to which it adds the unit cost of distribution, enabling it to know the full unit cost of the product. The company adds its margin and thus determines its selling price. The margin is used to remunerate the capital providers and to finance the activity of the company and its investments. Production costs are related to the quantities produced: large-scale production generally reduces the unit cost of production, which is an asset for companies with mass production markets. This shows that small and medium-sized companies with limited market segments need to make greater efforts to ensure their profit margins. As a result, and faced with fluctuating market prices for raw materials and increasing staff costs, the company must seek to optimize its production time in order to reduce loads and eliminate waste. Then, the customer pays only for value added. Thus, based on this principle, we decided to create a project that deals with the problem of waste in our company, with the objectives of reducing production costs and improving performance indicators. This paper presents the implementation of a Value Stream Mapping (VSM) project in a pharmaceutical company. It is structured as follows: 1) determination of the family of products, 2) drawing of the current state, 3) drawing of the future state, 4) action plan and implementation.
Keywords: VSM, waste, production time, kaizen, cartography, improvement
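A tiny worked example of the cost-plus price build-up described above; all figures are illustrative, not the company's actual costs.

```python
# Illustrative cost-plus pricing build-up (all figures are made-up examples)
unit_production_cost   = 2.40   # raw materials, labour, overhead per unit
unit_distribution_cost = 0.35   # logistics per unit
margin_rate            = 0.25   # margin applied on the full unit cost

unit_cost     = unit_production_cost + unit_distribution_cost
selling_price = unit_cost * (1 + margin_rate)
print(f"unit cost = {unit_cost:.2f}, selling price = {selling_price:.2f}")
```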
Procedia PDF Downloads 151
3303 Logistic Regression Based Model for Predicting Students’ Academic Performance in Higher Institutions
Authors: Emmanuel Osaze Oshoiribhor, Adetokunbo MacGregor John-Otumu
Abstract:
In recent years, there has been a desire to forecast student academic achievement prior to graduation. This is to help them improve their grades, particularly individuals with poor performance. The goal of this study is to employ supervised learning techniques to construct a predictive model of student academic achievement. Many academics have already constructed models that predict student academic achievement based on factors such as smoking, demography, culture, social media, parents' educational background, parents' finances, and family background, to name a few. These features and the models employed may not have correctly classified the students in terms of their academic performance. The model in this study is built using a logistic regression classifier with basic features such as the previous semester's course score, class attendance, class participation, and the total number of course materials or resources the student is able to cover per semester, in order to predict whether the student will perform well in the future on related courses. The model outperformed other classifiers such as Naive Bayes, Support Vector Machine (SVM), Decision Tree, Random Forest, and AdaBoost, returning 96.7% accuracy. This model is available as a desktop application, allowing both instructors and students to benefit from user-friendly interfaces for predicting student academic achievement. As a result, it is recommended that both students and professors use this tool to better forecast outcomes.
Keywords: artificial intelligence, ML, logistic regression, performance, prediction
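The following minimal Python sketch mirrors the kind of logistic regression classifier described above, using the four features named in the abstract; the column names, values, and train/test split are assumptions for illustration, not the study's dataset.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical records: the four predictors named in the abstract and a pass/fail label
data = pd.DataFrame({
    "previous_score":    [72, 55, 88, 40, 65, 90, 35, 78],
    "attendance_pct":    [90, 60, 95, 50, 80, 98, 45, 85],
    "participation":     [ 3,  1,  4,  1,  2,  5,  0,  3],   # e.g. contributions per week
    "materials_covered": [ 8,  4, 10,  3,  6, 10,  2,  7],   # resources covered per semester
    "performs_well":     [ 1,  0,  1,  0,  1,  1,  0,  1],
})

X = data.drop(columns="performs_well")
y = data["performs_well"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```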
Procedia PDF Downloads 97
3302 Interaction between Breathiness and Nasality: An Acoustic Analysis
Authors: Pamir Gogoi, Ratree Wayland
Abstract:
This study investigates the acoustic measures of breathiness when coarticulated with nasality. The acoustic correlates of breathiness and nasality have already been well established after years of empirical research. Some of these acoustic parameters, like low frequency peaks and wider bandwidths, are common to both nasal and breathy voice. Therefore, it is likely that these parameters interact when a sound is coarticulated with breathiness and nasality. This leads to the hypothesis that the acoustic parameters which usually act as robust cues in differentiating between breathy and modal voice might not be reliable cues when breathiness is coarticulated with nasality. The effect of nasality on the perception of breathiness has been explored in earlier studies using synthesized speech. The results showed that, perceptually, nasality and breathiness do interact. The current study investigates whether a similar pattern is observed in natural speech. The study is conducted on Marathi, an Indo-Aryan language which has a three-way contrast between nasality and breathiness. That is, there is a phonemic distinction between nasals, breathy voice and breathy-nasals. Voice quality parameters like H1-H2 (difference between the amplitude of the first and second harmonics), H1-A3 (difference between the amplitude of the first harmonic and the third formant), CPP (Cepstral Peak Prominence), HNR (Harmonics to Noise Ratio) and B1 (bandwidth of the first formant) were extracted. Statistical models like linear mixed effects regression and Random Forest classifiers show that measures that capture the noise component in the signal, like CPP and HNR, can classify breathy voice from modal voice better than spectral measures when breathy voice is coarticulated with nasality.
Keywords: breathiness, Marathi, nasality, voice quality
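As an illustration of the Random Forest classification step mentioned above, the sketch below trains a classifier on the listed voice quality measures to separate breathy from modal tokens; the feature values are placeholders rather than the Marathi recordings, and the study's mixed-effects models are not reproduced here.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical per-token acoustic measures (values are placeholders, not real data)
tokens = pd.DataFrame({
    "CPP":   [12.1, 18.4, 11.5, 19.0, 13.0, 17.6],
    "HNR":   [ 8.2, 15.1,  7.9, 16.3,  9.1, 14.8],
    "H1_H2": [ 6.5,  1.2,  7.1,  0.8,  5.9,  1.5],
    "H1_A3": [18.0, 10.2, 19.5,  9.8, 17.2, 11.0],
    "B1":    [350., 180., 370., 170., 340., 190.],
    "voice": ["breathy", "modal", "breathy", "modal", "breathy", "modal"],
})

X, y = tokens.drop(columns="voice"), tokens["voice"]
clf = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(clf, X, y, cv=3)   # how well the measures separate the two classes
print(scores.mean())
```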
Procedia PDF Downloads 96
3301 Deep Learning-Based Automated Structure Deterioration Detection for Building Structures: A Technological Advancement for Ensuring Structural Integrity
Authors: Kavita Bodke
Abstract:
Structural health monitoring (SHM) is experiencing growth, necessitating the development of distinct methodologies to address its expanding scope effectively. In this study, we developed automatic structural damage identification, which addresses three distinct threats to a building's structural integrity. The first pertains to the presence of fractures within the structure, the second relates to the issue of dampness within the structure, and the third involves corrosion inside the structure. This study employs image classification techniques to discern between intact and impaired structures within structural data. The aim of this research is to achieve automatic damage detection with the probability of each damage class being present in one image. Based on this probability, we know which class has a higher probability, i.e., affects the structure more than the other classes. Photographs captured by a mobile camera serve as the input for the image classification system. Image classification was employed in our study to perform multi-class and multi-label classification. The objective was to categorize structural data based on the presence of cracks, moisture, and corrosion. In the context of multi-class image classification, our study employed three distinct methodologies: Random Forest, Multilayer Perceptron, and CNN. For the task of multi-label image classification, the models employed were ResNet, Xception, and Inception.
Keywords: SHM, CNN, deep learning, multi-class classification, multi-label classification
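A minimal sketch of a multi-label image classifier of the kind described above, using a pretrained backbone with a three-unit sigmoid head so each photograph receives an independent probability for crack, moisture, and corrosion; the backbone choice, input size, and training details are assumptions, not the study's exact architecture.

```python
import tensorflow as tf

# Pretrained backbone plus a 3-unit sigmoid head, one output per damage type
# (crack, moisture, corrosion), so each image gets an independent probability per class.
base = tf.keras.applications.ResNet50(
    include_top=False, weights="imagenet", pooling="avg", input_shape=(224, 224, 3)
)
outputs = tf.keras.layers.Dense(3, activation="sigmoid", name="damage_probs")(base.output)
model = tf.keras.Model(inputs=base.input, outputs=outputs)

model.compile(
    optimizer="adam",
    loss="binary_crossentropy",              # multi-label: each class scored independently
    metrics=[tf.keras.metrics.AUC(multi_label=True)],
)
model.summary()
# model.fit(train_images, train_labels, ...) would follow with labeled photographs
```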
Procedia PDF Downloads 36
3300 Infodemic Detection on Social Media with a Multi-Dimensional Deep Learning Framework
Authors: Raymond Xu, Cindy Jingru Wang
Abstract:
Social media has become a globally connected and influential platform. Social media data, such as tweets, can help predict the spread of pandemics and provide individuals and healthcare providers with early warnings. Public psychological reactions and opinions can be efficiently monitored by AI models following the progression of dominant topics on Twitter. However, statistics show that as the coronavirus spreads, so does an infodemic of misinformation due to pandemic-related factors such as unemployment and lockdowns. Social media algorithms are often biased toward outrage by promoting content that people have an emotional reaction to and are likely to engage with. This can influence users' attitudes and cause confusion. Therefore, social media is a double-edged sword. Combating fake news and biased content has become one of the essential tasks. This research analyzes the variety of methods used for fake news detection, covering random forest, logistic regression, support vector machines, decision tree, naive Bayes, BoW, TF-IDF, LDA, CNN, RNN, LSTM, DeepFake, and hierarchical attention networks. The performance of each method is analyzed. Based on these models' achievements and limitations, a multi-dimensional AI framework is proposed to achieve higher accuracy in infodemic detection, especially for pandemic-related news. The model is trained on contextual content, images, and news metadata.
Keywords: artificial intelligence, fake news detection, infodemic detection, image recognition, sentiment analysis
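As a small illustration of one of the baseline text-classification methods surveyed above (TF-IDF features with logistic regression), the following sketch uses an invented toy corpus; the proposed multi-dimensional framework itself, which also uses images and metadata, is not reproduced here.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny placeholder corpus; real work would use a labeled tweet dataset
texts = [
    "Vaccine shown effective in peer-reviewed trial",
    "Miracle cure hidden by governments, share now!",
    "Health agency updates mask guidance for schools",
    "5G towers spread the virus, doctors silenced",
]
labels = [0, 1, 0, 1]   # 0 = reliable, 1 = misinformation

# TF-IDF features feeding a linear classifier, one of the baselines discussed
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
clf.fit(texts, labels)
print(clf.predict(["New study confirms miracle cure they don't want you to see"]))
```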
Procedia PDF Downloads 255
3299 Ethnobotanical Survey of Vegetable Plants Traditionally Used in Kalasin Thailand
Authors: Aree Thongpukdee, Chockpisit Thepsithar, Chuthalak Thammaso
Abstract:
The use of locally grown plants as food has a long tradition in different cultures. Indigenous knowledge, such as the usage of plants as vegetables by local people, is at risk of disappearing when no records are made. In order to conserve and transfer this valuable heritage to the new generation, an ethnobotanical study should be carried out and documented. The survey of traditionally used vegetable plants was carried out in the year 2012. Information was accumulated via questionnaires and oral interviews with 100 people living in 36 villages of 9 districts in Amphoe Huai Mek, Kalasin, Thailand. Local plant names, utilized parts and preparation methods of the plants were recorded. Each mentioned plant species was collected and voucher specimens were prepared. A total of 55 vegetable plant species belonging to 34 families and 54 genera were identified. The plant habits were tree, shrub, herb, climber, and shrubby fern at 21.82%, 18.18%, 38.18%, 20.00% and 1.82%, respectively. The most frequently encountered vegetable plant families were Leguminosae (20%), Cucurbitaceae (7.27%) and Apiaceae (5.45%), whereas families with 3.64% of uses were Araceae, Bignoniaceae, Lamiaceae, Passifloraceae, Piperaceae and Solanaceae. The most common preparations were fresh or briefly boiled young shoots or young leaves eaten as side dishes of 'jaeo, laab, namprik, pon' or curries. Most locally known vegetables, comprising 45% of the studied plants, grow along roadsides, backyard gardens, hedgerows, open forest and rice fields.
Keywords: vegetable plants, ethnobotanical survey, Kalasin, Thailand
Procedia PDF Downloads 314
3298 ANOVA-Based Feature Selection and Machine Learning System for IoT Anomaly Detection
Authors: Muhammad Ali
Abstract:
Cyber-attacks and anomaly detection on Internet of Things (IoT) infrastructure are an emerging concern in the domain of data-driven intrusion detection. Rapidly increasing IoT risk is now making headlines around the world. Denial of service, malicious control, data type probing, malicious operation, DDoS, scan, spying, and wrong setup are attacks and anomalies that can cause an IoT system failure. Everyone talks about cyber security, connectivity, smart devices, and real-time data extraction. IoT devices expose a wide variety of new cyber security attack vectors in network traffic. Beyond IoT development, and mainly for smart and IoT applications, there is a necessity for intelligent processing and analysis of data, so our approach aims to secure these systems. We train several machine learning models, which are compared for accurately predicting attacks and anomalies on IoT systems, considering IoT applications, using ANOVA-based feature selection to obtain leaner prediction models that evaluate network traffic and help protect IoT devices. The machine learning (ML) algorithms used here are KNN, SVM, NB, D.T., and R.F., and the most satisfactory test accuracy was obtained together with fast detection. The evaluation of ML metrics includes precision, recall, F1 score, FPR, NPV, G.M., MCC, and AUC & ROC. The Random Forest algorithm achieved the best results with less prediction time, with an accuracy of 99.98%.
Keywords: machine learning, analysis of variance, Internet of Things, network security, intrusion detection
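A hedged sketch of the ANOVA-based feature selection plus Random Forest pipeline described above, using synthetic data in place of real IoT traffic; the feature count, value of k, and model settings are assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.metrics import classification_report

# Synthetic stand-in for labeled IoT traffic records (normal vs. attack)
X, y = make_classification(n_samples=1000, n_features=30, n_informative=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# ANOVA F-test keeps the k most discriminative features before the Random Forest
model = make_pipeline(
    SelectKBest(score_func=f_classif, k=10),
    RandomForestClassifier(n_estimators=200, random_state=0),
)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```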
Procedia PDF Downloads 125
3297 A Dynamic Solution Approach for Heart Disease Prediction
Authors: Walid Moudani
Abstract:
The healthcare environment is generally perceived as being information rich yet knowledge poor. However, there is a lack of effective analysis tools to discover hidden relationships and trends in data. In fact, valuable knowledge can be discovered from the application of data mining techniques in the healthcare system. In this study, a proficient methodology for the extraction of significant patterns from coronary heart disease warehouses for heart attack prediction, which unfortunately continues to be a leading cause of mortality worldwide, is presented. For this purpose, we propose to enumerate dynamically the optimal subsets of the reduced features of high interest by using the rough sets technique associated with dynamic programming. We then propose to validate the classification using a Random Forest (RF) of decision trees to identify the risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions, based on the medical profiles of patients. Moreover, the experts' knowledge in this field has been taken into consideration in order to define the disease, its risk factors, and to establish significant knowledge relationships among the medical factors. A computer-aided system is developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated based on a set of benchmark techniques applied to this classification problem.
Keywords: multi-classifier decisions tree, features reduction, dynamic programming, rough sets
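The rough-set/dynamic-programming enumeration described above is not reproduced here; as a much-simplified stand-in, the sketch below brute-forces small feature subsets and keeps the one with the best cross-validated Random Forest score, using a public tabular medical dataset as a placeholder for the 525-adult cohort.

```python
from itertools import combinations
from sklearn.datasets import load_breast_cancer   # placeholder tabular medical dataset
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
candidate_features = list(X.columns[:8])           # pretend these are the reduced features

best_subset, best_score = None, 0.0
# Brute-force enumeration of small subsets stands in for the rough-set/DP step
for k in (3, 4):
    for subset in combinations(candidate_features, k):
        clf = RandomForestClassifier(n_estimators=50, random_state=0)
        score = cross_val_score(clf, X[list(subset)], y, cv=5).mean()
        if score > best_score:
            best_subset, best_score = subset, score

print(best_subset, round(best_score, 3))
```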
Procedia PDF Downloads 410
3296 Unveiling the Reaction Mechanism of N-Nitroso Dimethyl Amine Formation from Substituted Hydrazine Derivatives During Ozonation: A Computational Study
Authors: Rehin Sulay, Anandhu Krishna, Jintumol Mathew, Vibin Ipe Thomas
Abstract:
N-Nitrosodimethylamine (NDMA), the simplest member of the N-nitrosamine family, is a carcinogenic and mutagenic agent that has gained considerable research interest owing to its toxic nature. Ozonation of industrially important hydrazines such as unsymmetrical dimethylhydrazine (UDMH) or monomethylhydrazine (MMH) has been associated with NDMA formation and accumulation in the environment. UDMH/MMH ozonation also leads to several other transformation products such as acetaldehyde dimethylhydrazone (ADMH), tetramethyltetrazene (TMT), diazomethane, methyl diazene, etc., which can be either precursors or competitors for NDMA formation. In this work, we explored the formation mechanism of ADMH and TMT from UDMH ozonation and their further oxidation to NDMA using second-order Møller-Plesset perturbation theory employing the 6-311G(d) basis set. We have also investigated how MMH selectively forms methyl diazene and diazomethane under normal conditions and NDMA in the presence of excess ozone. Our calculations indicate that the reactions proceed via an initial H abstraction from the hydrazine -NH2 group, followed by the oxidation of the generated N-radical species. The formation of ADMH from the UDMH-ozone reaction involves an acetaldehyde intermediate, which then reacts with a second UDMH molecule to generate ADMH. The preferential attack of the ozone molecule on the N=C bond of ADMH generates the DMAN intermediate, which subsequently undergoes oxidation to form NDMA. Unlike the other transformation products, TMT formation occurs via the dimerization of DMAN. Though there exist N=N bonds in TMT, which are preferred attack sites for ozone, experimental studies show lower yields of NDMA formation, which corroborates the high activation barrier required for the process (42 kcal/mol). Overall, our calculated results agree well with the experimental observations and rate constants. Computational calculations bring insights into the electronic nature and kinetics of the elementary reactions of this pathway, enabled by computed energies of structures that are not possible to access experimentally.
Keywords: reaction mechanism, ozonation, substituted hydrazine, transition state
Procedia PDF Downloads 82
3295 Transition in Protein Profile, Maillard Reaction Products and Lipid Oxidation of Flavored Ultra High Temperature Treated Milk
Authors: Muhammad Ajmal
Abstract:
Thermal processing and subsequent storage of ultra-heat treated (UHT) milk leads to alterations in protein profile, Maillard reaction and lipid oxidation. The concentration of carbohydrates in the normal and flavored versions of UHT milk is considerably different. Transitions in protein profile, Maillard reaction and lipid oxidation in UHT flavored milk were determined for 90 days at ambient conditions and analyzed at 0, 45 and 90 days of storage. Protein profile, hydroxymethyl furfural, furosine, Nε-carboxymethyl-l-lysine, fatty acid profile, free fatty acids, peroxide value and sensory characteristics were determined. After 90 days of storage, fat, protein, total solids contents and pH were significantly lower than the initial values determined at day 0. Compared with the protein profile of normal UHT milk, more pronounced changes were recorded in the different protein fractions of UHT flavored milk at 45 and 90 days of storage. Tyrosine contents of flavored UHT milk at 0, 45 and 90 days of storage were 3.5, 6.9 and 15.2 µg tyrosine/ml. After 45 days of storage, the declines in αs1-casein, αs2-casein, β-casein, κ-casein, β-lactoglobulin, α-lactalbumin, immunoglobulin and bovine serum albumin were 3.35%, 10.5%, 7.89%, 18.8%, 53.6%, 20.1%, 26.9% and 37.5%. After 90 days of storage, the declines in αs1-casein, αs2-casein, β-casein, κ-casein, β-lactoglobulin, α-lactalbumin, immunoglobulin and bovine serum albumin were 11.2%, 34.8%, 14.3%, 33.9%, 56.9%, 24.8%, 36.5% and 43.1%. Hydroxymethyl furfural contents of UHT milk at 0, 45 and 90 days of storage were 1.56, 4.18 and 7.61 µmol/L. Furosine contents of flavored UHT milk at 0, 45 and 90 days of storage were 278, 392 and 561 mg/100 g protein. Nε-carboxymethyl-l-lysine contents of UHT flavored milk at 0, 45 and 90 days of storage were 67, 135 and 343 mg/kg protein. After 90 days of storage of flavored UHT milk, the loss of unsaturated fatty acids was 45.7% from the initial values. At 0, 45 and 90 days of storage, free fatty acids of flavored UHT milk were 0.08%, 0.11% and 0.16% (p<0.05). Peroxide values of flavored UHT milk at 0, 45 and 90 days of storage were 0.22, 0.65 and 2.88 meq O2/kg. Sensory analysis of flavored UHT milk after 90 days indicated that appearance, flavor and mouthfeel scores significantly decreased from the initial values recorded at day 0. The findings of this investigation evidenced that in flavored UHT milk more pronounced changes take place in protein profile, Maillard reaction products and lipid oxidation as compared to normal UHT milk.
Keywords: UHT flavored milk, hydroxymethyl furfural, lipid oxidation, sensory properties
Procedia PDF Downloads 199
3294 Rheolaser: Light Scattering Characterization of Viscoelastic Properties of Hair Cosmetics That Are Related to Performance and Stability of the Respective Colloidal Soft Materials
Authors: Heitor Oliveira, Gabriele De-Waal, Juergen Schmenger, Lynsey Godfrey, Tibor Kovacs
Abstract:
Rheolaser MASTER™ makes use of multiple scattering of light, caused by scattering objects in a continuous medium (such as droplets and particles in colloids), to characterize the viscoelasticity of soft materials. It offers an alternative to conventional rheometers for characterizing the viscoelasticity of products such as hair cosmetics. Up to six measurements at controlled temperature can be carried out simultaneously (10-15 min), and the method requires only minor sample preparation work. In contrast to conventional rheometer-based methods, no mechanical stress is applied to the material during the measurements. Therefore, the properties of the exact same sample can be monitored over time, as in aging and stability studies. We determined the elastic index (EI) of water/emulsion mixtures (1 ≤ fat alcohols (FA) ≤ 5 wt%) and emulsion/gel-network mixtures (8 ≤ FA ≤ 17 wt%) and compared it with the elastic/storage modulus (G') for the respective samples using a TA conventional rheometer with flat plate geometry. As expected, it was found that log(EI) vs log(G') presents a linear behavior. Moreover, log(EI) increased in a linear fashion with solids level over the entire range of compositions (1 ≤ FA ≤ 17 wt%), while rheometer measurements were limited to samples down to 4 wt% solids level. Alternatively, a concentric cylinder geometry would be required for more diluted samples (FA < 4 wt%), and rheometer results from different sample holder geometries are not comparable. The plot of the Rheolaser output parameter solid-liquid balance (SLB) vs EI was suitable to monitor product aging processes. These data could quantitatively describe some observations, such as the formation of lumps over aging time. Moreover, this method allowed us to identify that the different specifications of a key raw material (RM < 0.4 wt%) in the respective gel-network (GN) product have minor impact on product viscoelastic properties and are not consumer perceivable after a short aging time. Broadening of an RM spec range typically has a positive impact on cost savings. Last but not least, the photon path length (λ*), which is proportional to droplet size and inversely proportional to the volume fraction of scattering objects according to the Mie theory, and the EI were suitable to characterize product destabilization processes (e.g., coalescence and creaming) and to predict product stability about eight times faster than our standard methods. Using these parameters we could successfully identify formulation and process parameters that resulted in unstable products. In conclusion, Rheolaser allows quick and reliable characterization of viscoelastic properties of hair cosmetics that are related to their performance and stability. It operates over a broad range of product compositions and has applications spanning from the formulation of our hair cosmetics to fast release criteria in our production sites. Last but not least, this powerful tool has a positive impact on R&D development time, with faster delivery of new products to the market, and consequently on cost savings.
Keywords: colloids, hair cosmetics, light scattering, performance and stability, soft materials, viscoelastic properties
Procedia PDF Downloads 172
3293 Human TP53 Three Dimensional (3D) Core Domain Hot Spot Mutations at Codons 36, 72 and 240 are Associated with Oral Squamous Cell Carcinoma
Authors: Saima Saleem, Zubair Abbasi, Abdul Hameed, Mansoor Ahmed Khan, Navid Rashid Qureshi, Abid Azhar
Abstract:
Oral Squamous Cell Carcinoma (OSCC) is a leading cause of death in developing countries like Pakistan. This problem is aggravated by the excessive use of available chewing products. In spite of widespread information on their use and purported legislation against their use, the Pakistani markets are classical examples of selling chewable carcinogenic mutagens. Reported studies indicate that these products are rich in reactive oxygen species (ROS) and polyphenols. The TP53 gene is involved in tumor suppression. It has been reported that somatic mutations in the TP53 gene are at the foundation of the cancer. This study aims to find the loss of TP53 functions due to mutation/polymorphism caused by genomic alteration and interaction with tobacco and its related ingredients. A total of 260 tissue and blood specimens were collected from OSCC patients and compared with age- and sex-matched controls. Mutations in exons 2-11 of TP53 were examined by PCR-SSCP. Samples showing a mobility shift were directly sequenced. Two mutations were found in exon 4, at nucleotide positions 108 and 215, and one in exon 7, at nucleotide position 719 of the coding sequence, in patients' tumor samples. These results show the substitution of proline with arginine at codon 72 and of serine with threonine at codon 240 of the p53 protein. These polymorphic changes, found in tumor samples of OSCC, could be involved in loss of heterozygosity and apoptotic activity in the binding domain of TP53. The model of the mutated TP53 gene elaborated a nonfunctional, unfolded p53 protein, suggesting an important role of these mutations in p53 protein inactivation and malfunction. This nonfunctional 3D model also indicates that exogenous tobacco-related carcinogens may act as DNA-damaging agents affecting the structure of DNA. The interpretations could be helpful in establishing the pathways responsible for tumor formation in OSCC patients.
Keywords: TP53 mutation/polymorphism, OSCC, PCR-SSCP, direct DNA sequencing, 3D structure
Procedia PDF Downloads 366
3292 Impact of the Hayne Royal Commission on the Operating Model of Australian Financial Advice Firms
Authors: Mohammad Abu-Taleb
Abstract:
The final report of the Royal Commission into Australian financial services misconduct, released in February 2019, has had a significant impact on the financial advice industry. The recommendations released in the Commissioner's final report include changes to ongoing fee arrangements, a new disciplinary system for financial advisers, and mandatory reporting of compliance concerns. This thesis aims to explore the impact of the Royal Commission's recommendations on the operating model of financial advice firms in terms of advice products, processes, delivery models, and customer segments. This research also seeks to investigate whether the Royal Commission's outcome has accelerated the use of enhanced technology solutions within the operating model of financial advice firms, and to identify the key challenges confronting financial advice firms whilst implementing the Commissioner's recommendations across their operating models. In order to achieve the objectives of this thesis, a qualitative research design has been adopted through semi-structured in-depth interviews with 24 financial advisers and managers who are engaged in the operation of financial advice services. The study used the thematic analysis approach to interpret the qualitative data collected from the interviews. The findings of this thesis reveal that customer-centric operating models will become more prominent across the financial advice industry in response to the Commissioner's final report, and that the Royal Commission's outcome has accelerated the use of advice technology solutions within the operating model of financial advice firms. In addition, financial advice firms have started, more than before, to use simpler and more automated web-based advice services, which enable financial advisers to provide simple advice at a greater scale, and also to accelerate the use of robo-advice models and digital delivery to mass customers in the long term. Furthermore, the study identifies process and technology changes, along with technical and interpersonal skills development, as the key challenges encountered by financial advice firms whilst implementing the Commissioner's recommendations across their operating models.
Keywords: Hayne Royal Commission, financial planning advice, operating model, advice products, advice processes, delivery models, customer segments, digital advice solutions
Procedia PDF Downloads 88
3291 A Techno-Economic Evaluation of Bio Fuel Production from Waste of Starting Dates in South Algeria
Authors: Insaf Mehani, Bachir Bouchekima
Abstract:
The necessary and progressive reduction of fossil fuel consumption, given that the scarcity of these fuels is inevitable, involves mobilizing a set of alternatives. Renewable energy, including bioenergy, is an alternative to fossil fuel depletion and a way to fight against the harmful effects of climate change. It is possible to valorize common dates of low commercial value and put on the local and international market a new generation of products with high added value, such as bioethanol. Besides its use in chemical synthesis, bioethanol can be blended with gasoline to produce a clean fuel while improving the octane rating.
Keywords: bioenergy, dates, bioethanol, renewable energy, south Algeria
Procedia PDF Downloads 489
3290 Probabilistic Approach of Dealing with Uncertainties in Distributed Constraint Optimization Problems and Situation Awareness for Multi-agent Systems
Authors: Sagir M. Yusuf, Chris Baber
Abstract:
In this paper, we describe how Bayesian inferential reasoning contributes to obtaining well-satisfied predictions for Distributed Constraint Optimization Problems (DCOPs) with uncertainties. We also demonstrate how DCOPs can be merged with multi-agent knowledge understanding and prediction (i.e., Situation Awareness). The DCOP functions were merged with a Bayesian Belief Network (BBN) in the form of situation, awareness, and utility nodes. We describe how the uncertainties can be represented in the BBN and how effective predictions can be made using the expectation-maximization algorithm or the conjugate gradient descent algorithm. The idea of variable prediction using Bayesian inference may reduce the number of variables in agents' sampling domains and also allow missing variables to be estimated. Experimental results showed that the BBN performs more compelling predictions with samples containing uncertainties than with the perfect samples. That is, Bayesian inference can help in handling the uncertainties and dynamism of DCOPs, which is a current issue in the DCOP community. We show how Bayesian inference could be formalized with Distributed Situation Awareness (DSA) using uncertain and missing agents' data. The whole framework was tested on a multi-UAV mission for forest fire searching. Future work focuses on augmenting the existing architecture to deal with dynamic DCOP algorithms and multi-agent information merging.
Keywords: DCOP, multi-agent reasoning, Bayesian reasoning, swarm intelligence
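As a minimal stand-in for the BBN situation node described above, the sketch below performs a plain Bayesian belief update for a single "fire present" variable from noisy UAV sensor readings; the probabilities are assumed values, and the paper's full framework (EM/conjugate-gradient learning over situation, awareness, and utility nodes) is not reproduced.

```python
import numpy as np

# Toy situation-awareness node: posterior belief that a grid cell contains fire,
# updated from noisy UAV sensor readings (all probabilities here are assumed values).
prior = np.array([0.95, 0.05])             # P(no fire), P(fire)
p_hot_given_state = np.array([0.10, 0.85])  # false-alarm rate vs. detection rate

belief = prior.copy()
for reading in ["hot", "hot", "cold"]:      # sequence of observations from one cell
    likelihood = p_hot_given_state if reading == "hot" else 1.0 - p_hot_given_state
    belief = likelihood * belief
    belief /= belief.sum()                  # normalize to a proper distribution
    print(f"after '{reading}': P(fire) = {belief[1]:.3f}")
```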
Procedia PDF Downloads 119
3289 Optimal Pricing Based on Real Estate Demand Data
Authors: Vanessa Kummer, Maik Meusel
Abstract:
Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data do, however, suffer from missing information; for example, many users do not specify how many rooms they would like or what price they would be willing to pay. In economic analyses, it is often the case that only complete data is used. Usually, however, the proportion of complete data is rather small, which leads to most information being neglected. Also, the data might have a strong distortion if only complete records are kept. In addition, the reason that data is missing might itself also contain information, which is ignored with that approach. An interesting issue is, therefore, whether for economic analyses such as the one at hand there is added value in using the whole data set with the imputed missing values compared to using the usually small percentage of complete data (baseline). Also, it is interesting to see how different algorithms affect that result. The imputation of the missing data is done using unsupervised learning. Out of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, or neural network techniques, are applied. By training the model iteratively on the imputed data and, thereby, including the information of all data in the model, the distortion of the first training set (the complete data) vanishes. In a next step, the performance of the algorithms is measured. This is done by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms and several parameter combinations, and comparing the estimates to the actual data. After having found the optimal parameter set for each algorithm, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates which are based on imputed data sets do not differ significantly from each other; however, the demand estimate that is derived from the baseline data does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Also, demand estimates derived from the whole data set are much more accurate than the baseline estimation. Thus, in order to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.
Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning
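As a hedged sketch of the two steps described above, the following example imputes missing search-subscription fields with a model-based imputer and then computes an empirical survival function over stated maximum prices; the data, column names, and the choice of imputer are illustrative assumptions, not the paper's algorithms.

```python
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401, enables the class below
from sklearn.impute import IterativeImputer

# Hypothetical search subscriptions: desired rooms and maximum price, with missing entries
subs = pd.DataFrame({
    "rooms":     [3.0, np.nan, 4.0, 2.0, np.nan, 5.0],
    "max_price": [8e5, 6e5, np.nan, 5e5, 9e5, np.nan],
})

# Model-based (unsupervised) imputation of the missing fields
imputed = pd.DataFrame(
    IterativeImputer(random_state=0).fit_transform(subs), columns=subs.columns
)

# Empirical survival function of stated willingness to pay:
# S(p) = share of searchers willing to pay at least price p
prices = np.sort(imputed["max_price"].values)
survival = np.arange(len(prices), 0, -1) / len(prices)
for p, s in zip(prices, survival):
    print(f"price >= {p:,.0f}: {s:.2f}")
```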
Procedia PDF Downloads 285
3288 IT System in the Food Supply Chain Safety, Application in SMEs Sector
Authors: Mohsen Shirani, Micaela Demichela
Abstract:
The food supply chain is one of the most complex supply chain networks due to its perishable nature and customer-oriented products, and food safety is the major concern for this industry. An IT system could help to minimize the production and consumption of unsafe food by controlling and monitoring the entire system. However, there have been many issues in the adoption of IT systems in this industry, specifically within the SME sector. In this regard, this study presents a novel approach to using IT and traceability systems in the food supply chain, using the application of RFID and a central database.
Keywords: food supply chain, IT system, safety, SME
Procedia PDF Downloads 477
3287 Development of Internet of Things (IoT) with Mobile Voice Picking and Cargo Tracing Systems in Warehouse Operations of Third-Party Logistics
Authors: Eugene Y. C. Wong
Abstract:
The increased market competition, customer expectations, and warehouse operating costs in third-party logistics have motivated the continuous exploration of ways to improve operational efficiency in warehouse logistics. Cargo tracing in the order picking process consumes excessive time for warehouse operators when handling the enormous quantities of goods flowing through the warehouse each day. An Internet of Things (IoT) system with mobile cargo tracing apps and database management is developed in this research to facilitate and reduce the cargo tracing time in the order picking process of a third-party logistics firm. An operations review is carried out in the firm, with opportunities for improvement being identified, including inaccurate inventory records in the warehouse management system, excessive tracing time for stored products, and product misdelivery. The facility layout has been improved by modifying the designated locations of various types of products. The relationships among the pick and pack processing time, cargo tracing time, delivery accuracy, inventory turnover, and inventory count operation time in the warehouse are evaluated. The correlation of the factors affecting the overall cycle time is analysed. A mobile app is developed with the use of MIT App Inventor and the Access database to facilitate cargo tracking anytime, anywhere. The information flow framework from the warehouse database system to cloud computing document-sharing, and further to the mobile app device, is developed. The improved cargo tracing performance in the order processing cycle time of warehouse operators has been collected and evaluated. The developed mobile voice picking and tracking systems bring significant benefits to the third-party logistics firm, including eliminating unnecessary cargo tracing time in the order picking process and reducing warehouse operators' overtime costs. As future development, the mobile tracking device is planned to further enhance the picking time and cycle counting of warehouse operators with the voice picking system in the developed mobile apps.
Keywords: warehouse, order picking process, cargo tracing, mobile app, third-party logistics
Procedia PDF Downloads 374
3286 Allometric Models for Biomass Estimation in Savanna Woodland Area, Niger State, Nigeria
Authors: Abdullahi Jibrin, Aishetu Abdulkadir
Abstract:
The development of allometric models is crucial to accurate forest biomass/carbon stock assessment. The aim of this study was to develop a set of biomass prediction models that enable the determination of total tree aboveground biomass for a savanna woodland area in Niger State, Nigeria. Based on data collected through biometric measurements of 1816 trees and destructive sampling of 36 trees, five species-specific models and one site-specific model were developed. The sample size was distributed equally among the five most dominant species in the study site (Vitellaria paradoxa, Irvingia gabonensis, Parkia biglobosa, Anogeissus leiocarpus, Pterocarpus erinaceus). Firstly, the equations were developed for the five individual species. Secondly, these five species were pooled and used to develop a mixed-species allometric equation. Overall, there was a strong positive relationship between total tree biomass and stem diameter. Coefficients of determination (R2 values) ranging from 0.93 to 0.99 (P < 0.001) were realised for the models, with considerably low standard errors of the estimates (SEE), which confirms that the total tree aboveground biomass has a significant relationship with the dbh. The F-test values for the biomass prediction models were also significant at p < 0.001, which indicates that the biomass prediction models are valid. This study recommends that, for improved biomass estimates in the study site, the site-specific biomass models should preferably be used instead of generic models.
Keywords: allometry, biomass, carbon stock, model, regression equation, woodland, inventory
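The paper's fitted equations are not reproduced here; as an illustration of the general approach, the sketch below fits a common power-law allometric form, B = a * dbh^b, by log-log linear regression and reports R² on made-up data. The specific functional form and all values are assumptions.

```python
import numpy as np

# Illustrative fit of a common allometric form, B = a * dbh^b, via log-log regression.
# The dbh/biomass values below are made-up; the paper's own models use field data.
dbh     = np.array([ 8.0, 12.0, 15.0,  20.0,  25.0,  30.0,  38.0])    # cm
biomass = np.array([14.0, 42.0, 78.0, 170.0, 300.0, 480.0, 900.0])    # kg

b, log_a = np.polyfit(np.log(dbh), np.log(biomass), 1)   # slope b, intercept ln(a)
a = np.exp(log_a)

pred = a * dbh ** b
ss_res = np.sum((np.log(biomass) - np.log(pred)) ** 2)
ss_tot = np.sum((np.log(biomass) - np.log(biomass).mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"B = {a:.3f} * dbh^{b:.3f},  R^2 (log scale) = {r2:.3f}")
```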
Procedia PDF Downloads 448
3285 Choice Analysis of Ground Access to São Paulo/Guarulhos International Airport Using Adaptive Choice-Based Conjoint Analysis (ACBC)
Authors: Carolina Silva Ansélmo
Abstract:
Airports are demand-generating poles that affect the flow of traffic around them. The airport access system must be fast, convenient, and adequately planned, considering its potential users. An airport with good ground access conditions can provide the user with a more satisfactory access experience. When several transport options are available, service providers must understand users' preferences and the expected quality of service. The present study focuses on airport access in a comparative scenario between bus, private vehicle, subway, taxi and urban mobility transport applications for reaching São Paulo/Guarulhos International Airport. The objectives are (i) to identify the factors that influence the choice, (ii) to measure Willingness to Pay (WTP), and (iii) to estimate the market share for each mode. The applied method was the Adaptive Choice-Based Conjoint Analysis (ACBC) technique using Sawtooth Software. Conjoint analysis, rooted in Utility Theory, is a survey technique that quantifies the customer's perceived utility when choosing among alternatives. Assessing user preferences provides insights into their priorities for product or service attributes. An additional advantage of conjoint analysis is its requirement for a smaller sample size compared to other methods. Furthermore, ACBC provides valuable insights into consumers' preferences, willingness to pay, and market dynamics, aiding strategic decision-making to provide a better customer experience, pricing, and market segmentation. In the present research, the ACBC questionnaire had the following variables: (i) access time to the boarding point, (ii) comfort in the vehicle, (iii) number of travelers together, (iv) price, (v) supply power, and (vi) type of vehicle. The case study questionnaire reached 213 valid responses considering the scenario of access from the São Paulo city center to São Paulo/Guarulhos International Airport. As a result, price and the number of travelers are the most relevant attributes for the sample when choosing airport access. The resulting market share is led by urban mobility transport applications, followed by buses, private vehicles, taxis and the subway.
Keywords: adaptive choice-based conjoint analysis, ground access to airport, market share, willingness to pay
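ACBC part-worth estimation in Sawtooth Software typically relies on hierarchical Bayes; as a much-simplified stand-in, the sketch below fits a plain logit on chosen-versus-rejected alternatives and derives willingness to pay as the ratio of an attribute coefficient to the price coefficient. All attribute names and values are invented.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Each row is one alternative shown in a choice task: its attributes and whether it was chosen
alts = pd.DataFrame({
    "access_time_min": [35, 60, 45, 25, 50, 40],
    "comfort":         [ 1,  0,  1,  0,  1,  0],   # 1 = high comfort
    "price_brl":       [40, 80, 35, 95, 45, 70],
    "chosen":          [ 1,  0,  1,  0,  1,  0],
})

X, y = alts.drop(columns="chosen"), alts["chosen"]
logit = LogisticRegression(max_iter=1000).fit(X, y)

coefs = dict(zip(X.columns, logit.coef_[0]))
# WTP for one unit of an attribute = -(attribute utility) / (price utility)
wtp_comfort = -coefs["comfort"] / coefs["price_brl"]
print(f"implied WTP for high comfort: {wtp_comfort:.1f} BRL")
```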
Procedia PDF Downloads 78
3284 A Solar Heating System Performance on the Microclimate of an Agricultural Greenhouse
Authors: Nora Arbaoui, Rachid Tadili
Abstract:
The experiment adopted a natural technique for heating and cooling an agricultural greenhouse to reduce fuel consumption and CO2 emissions, based on the heating of a transfer fluid that circulates inside the greenhouse through a solar copper coil positioned at the roof of the greenhouse. This experimental study is devoted to the performance evaluation of a solar heating system to improve the microclimate of a greenhouse during the cold period, especially in the Mediterranean climate. This integrated solar heating system has a positive impact on the quality and quantity of the products grown in the studied greenhouse.
Keywords: solar system, agricultural greenhouse, heating, storage
Procedia PDF Downloads 77
3283 Effects of Brewer's Yeast Peptide Extract on the Growth of Probiotics and Gut Microbiota
Authors: Manuela Amorim, Cláudia S. Marques, Maria Conceição Calhau, Hélder J. Pinheiro, Maria Manuela Pintado
Abstract:
Recently, peptides with biological activities have been identified from different food sources. However, no relevant study has proven the potential of brewer's yeast peptides in the modulation of gut microbiota. The importance of the human intestinal microbiota in maintaining host health is well known. Probiotics, prebiotics and the combination of these two components can contribute to supporting an adequate balance of the bacterial population in the human large intestine. The survival of many bacterial species inhabiting the large bowel depends essentially on the substrates made available to them, most of which come directly from the diet. Some of these substrates can be selectively considered as prebiotics, which are food ingredients that can stimulate the growth of beneficial bacteria such as Lactobacilli or Bifidobacteria in the colon. Moreover, conventional food can be used as a vehicle for the intake of bioactive compounds that provide those health benefits and increase people's well-being. In this way, the main objective of this work was to study the potential prebiotic activity of a brewer's yeast peptide extract (BYP), obtained via hydrolysis of yeast proteins by cardosins present in Cynara cardunculus extract, for possible use as a functional ingredient. To evaluate the effect of BYP on the modulation of gut microbiota in a diet-induced obesity model, Wistar rats were fed either a standard or a high-fat diet. Quantified via 16S ribosomal RNA (rRNA) expression by quantitative PCR (qPCR), genera of beneficial bacteria (Lactobacillus spp. and Bifidobacterium spp.) and three main phyla (Firmicutes, Bacteroidetes and Actinobacteria) were assessed. Results showed that the relative abundance of Lactobacillus spp., Bifidobacterium spp. and Bacteroidetes was significantly increased (P < 0.05) by BYP. Consequently, the potential health-promoting effects of BYP through modulation of the gut microbiota were demonstrated in vivo. Altogether, these findings highlight the possible role of BYP as a gut microbiota enhancer promoting a healthy lifestyle, and its incorporation in new food products would bring associated benefits, endorsing a new trend in the development of value-added food products.
Keywords: functional ingredients, gut microbiota, prebiotics, brewer yeast peptide extract
Procedia PDF Downloads 500
3282 Microfiber Release During Laundry Under Different Rinsing Parameters
Authors: Fulya Asena Uluç, Ehsan Tuzcuoğlu, Songül Bayraktar, Burak Koca, Alper Gürarslan
Abstract:
Microplastics are contaminants that are widely distributed in the environment, with detrimental ecological effects. Besides this, recent research has proven the existence of microplastics in human blood and organs. Microplastics in the environment can be divided into two main categories: primary and secondary microplastics. Primary microplastics are plastics that are released into the environment as microscopic particles. On the other hand, secondary microplastics are the smaller particles that are shed as a result of the use of synthetic materials in textile products as well as other products. Textiles are the main source of microplastic contamination in aquatic ecosystems. Laundry of synthetic textiles (34.8%) accounts for an average annual discharge of 3.2 million tons of primary microplastics into the environment. Recently, research on microfiber shedding from laundry has gained traction. However, no comprehensive study has been conducted to analyze microfiber shedding from the standpoint of rinsing parameters during laundry. The purpose of the present study is to quantify microfiber shedding from fabric under different rinsing conditions and to determine the rinsing parameters that affect microfiber release in a laundry environment. In this regard, a parametric study is carried out to investigate the key factors affecting microfiber release from a front-load washing machine. These parameters are the amount of water used during the rinsing step and the spinning speed at the end of the washing cycle. The Minitab statistical program is used to create a design of experiments (DOE) and analyze the experimental results. Tests are repeated twice and, besides the controlled parameters, the other washing parameters are kept constant in the washing algorithm. At the end of each cycle, released microfibers are collected via a custom-made filtration system and weighed with a precision balance. The results showed that by increasing the water amount during the rinsing step, the amount of microplastic released from the washing machine increased drastically. Also, the parametric study revealed that increasing the spinning speed results in an increase in microfiber release from textiles.
Keywords: front load, laundry, microfiber, microfiber release, microfiber shedding, microplastic, pollution, rinsing parameters, sustainability, washing parameters, washing machine
Procedia PDF Downloads 98