Search results for: lean tools and techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10461

8241 Isolation, Preparation and Biological Properties of Soybean-Flaxseed Protein Co-Precipitates

Authors: Muhammad H. Alu’datt, Inteaz Alli

Abstract:

This study was conducted to prepare and evaluate the biological properties of protein co-precipitates from flaxseed and soybean. Protein was prepared by NaOH extraction through the mixing of soybean flour (Sf) and flaxseed flour (Ff) or mixtures of soybean extract (Se) and flaxseed extract (Fe). The protein co-precipitates were precipitated by isoelectric (IEP) and isoelectric-heating (IEPH) co-precipitation techniques. Effects of extraction and co-precipitation techniques on co-precipitate yield were investigated. Native-PAGE and SDS-PAGE were used to study the molecular characterization. The content and antioxidant activity of extracted free and bound phenolic compounds were evaluated for the protein co-precipitates. Removal of free and bound phenolic compounds from protein co-precipitates showed little effect on the electrophoretic behavior of the proteins or the protein subunits of the co-precipitates. Results showed that the highest protein content and yield were obtained for the Sf-Ff/IEP co-precipitate, with values of 53.28% and 25.58%, respectively, as compared to protein isolates and other co-precipitates. Results revealed that Sf-Ff/IEP showed a higher content of bound phenolic compounds (53.49% of total phenolic content) as compared to free phenolic compounds (46.51% of total phenolic content). Antioxidant activities of bound phenolic compounds extracted with and without heat treatment from Sf-Ff/IEPH were higher as compared to free phenolic compounds extracted from other protein co-precipitates (29.68 and 22.84%, respectively).

Keywords: antioxidant, phenol, protein co-precipitate, yield

Procedia PDF Downloads 242
8240 Improvement of Cardiometabolic Parameters after 8 Weeks of Weight Loss Intervention

Authors: Boris Bajer, Andrea Havranova, Miroslav Vlcek, Richard Imrich, Adela Penesova

Abstract:

Lifestyle interventions can prevent the deterioration of impaired glucose tolerance to manifest type 2 diabetes and can also prevent cardiovascular diseases, as many studies have shown (the Finnish Diabetes Prevention Study, the Diabetes Prevention Program (DPP), the China Da Qing Diabetes Prevention Study, etc.). Therefore, the aim of our study was to compare the effect of intensified lifestyle intervention on cardiometabolic parameters. Methods: This is an ongoing randomized interventional clinical study (NCT02325804) focused on the reduction of body weight/fat. Intervention: hypocaloric diet (30% restriction of calories) and physical activity of 150 minutes/week. Before and after 8 weeks of intervention, all patients underwent a complete medical examination (measurement of physical fitness, resting metabolic rate (RMR), body composition analysis, oral glucose tolerance test, parameters of lipid metabolism, and other cardiometabolic risk factors). Results: So far, 39 patients have finished the intervention. The average reduction of body weight was 6.8 ± 4.9 kg (0-15 kg; p=0.0006), accompanied by a significant reduction of body fat percentage (p ≤ 0.0001), fat mass (p=0.03) and waist circumference (p=0.02). Lean mass and RMR remained unchanged. Heart rate (p=0.02) and systolic and diastolic blood pressure were reduced (p=0.01 and p=0.02, respectively), and insulin sensitivity was improved. Lipid parameters also changed: total cholesterol and LDL decreased (p=0.05 and p=0.04, respectively), while triglycerides showed a tendency to decrease (p=0.055). Liver function improved, with alanine aminotransferase (ALT) reduced (p=0.01). Physical fitness improved significantly (as measured by VO2 max, p=0.02). Conclusion: The results of our study are in line with previous results on the beneficial effect of intensive lifestyle changes on the reduction of cardiometabolic risk factors and the improvement of liver function. Supported by grants APVV 15-0228 and VEGA 2/0161/16.

Keywords: obesity, weight loss, diet, lipids, blood pressure, liver enzymes

Procedia PDF Downloads 167
8239 Factors Affecting Test Automation Stability and Their Solutions

Authors: Nagmani Lnu

Abstract:

Test automation is a vital requirement for any organization to release products faster to their customers. In most cases, an organization has an approach to developing automation but struggles to maintain it. This results in an increased number of flaky tests, reducing return on investment and stakeholders’ confidence. Challenges grow manifold when automation targets UI behaviors. This paper describes the approaches taken to identify the root causes of automation instability in an extensive payments application and the best practices to address them using processes, tools, and technologies, resulting in a 75% reduction of effort.
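
The abstract does not include code; purely as a hedged illustration of one common practice behind flaky-test identification (rerunning a test several times and flagging inconsistent outcomes), the Python sketch below assumes a pytest-style runner is available on the PATH, and the test id and rerun count are illustrative choices rather than values from the paper.

```python
import subprocess
from collections import Counter

def classify_test(test_id: str, reruns: int = 5) -> str:
    """Rerun a single test several times and label it stable, flaky, or broken."""
    outcomes = []
    for _ in range(reruns):
        # Assumes pytest is installed and the test id is valid (illustrative only).
        result = subprocess.run(["pytest", test_id, "-q"], capture_output=True, text=True)
        outcomes.append("pass" if result.returncode == 0 else "fail")

    counts = Counter(outcomes)
    if len(counts) == 1:
        return "stable" if "pass" in counts else "broken"
    return "flaky"  # mixed pass/fail across identical reruns

if __name__ == "__main__":
    # Hypothetical test id; replace with a real one from the suite under study.
    print(classify_test("tests/test_payments.py::test_checkout"))
```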

Keywords: automation stability, test stability, Flaky Test, test quality, test automation quality

Procedia PDF Downloads 87
8238 Applying Artificial Neural Networks to Predict Speed Skater Impact Concussion Risk

Authors: Yilin Liao, Hewen Li, Paula McConvey

Abstract:

Speed skaters often face a risk of concussion when they fall on the ice floor and impact crash mats during practices and competitive races. Several variables, including those related to the skater, the crash mat, and the impact position (body side/head/feet impact), are believed to influence the severity of the skater's concussion. While computer simulation modeling can be employed to analyze these accidents, the simulation process is time-consuming and does not provide rapid information for coaches and teams to assess the skater's injury risk in competitive events. This research paper explores the feasibility of using AI techniques to evaluate a skater’s potential concussion severity and develops a fast concussion prediction tool using artificial neural networks to reduce the risk of treatment delays for injured skaters. The primary data is collected through virtual tests and physical experiments designed to simulate skater-mat impact. It is then analyzed to identify patterns and correlations; finally, it is used to train and fine-tune the artificial neural networks for accurate prediction. The development of the prediction tool by employing machine learning strategies contributes to the application of AI methods in sports science and has theoretical implications for using AI techniques in predicting and preventing sports-related injuries.
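
As a sketch of the kind of neural-network regressor described (not the authors' model or data), the Python example below trains a small multilayer perceptron on synthetic impact records; the input columns (impact speed, mat stiffness, body-region code) and the severity target are hypothetical placeholders for features derived from the virtual tests and physical experiments.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical columns: impact speed (m/s), mat stiffness (kN/m), body-region code.
rng = np.random.default_rng(0)
X = rng.uniform([5.0, 50.0, 0.0], [15.0, 300.0, 2.0], size=(200, 3))
# Toy severity score used only so the example runs end to end.
y = 0.4 * X[:, 0] - 0.01 * X[:, 1] + 2.0 * (X[:, 2] > 1.5) + rng.normal(0, 0.3, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("R^2 on held-out impacts:", round(model.score(X_test, y_test), 3))
```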

Keywords: artificial neural networks, concussion, machine learning, impact, speed skater

Procedia PDF Downloads 113
8237 Electrodeposition and Selenization of CuIn Alloys for the Synthesis of Photoactive CuIn1-xGaxSe2 (CIGS) Thin Films

Authors: Mohamed Benaicha, Mahdi Allam

Abstract:

A new two-stage electrochemical process, as a safe, large-area and low-processing-cost technique for the production of semiconducting CuInSe2 (CIS) thin films, is studied. CuIn precursors were first potentiostatically electrodeposited onto molybdenum substrates from an acidic thiocyanate electrolyte. In a second stage, the prepared metallic CuIn layers were used as substrates in the selenium electrochemical deposition system and subjected to a thermal treatment in a vacuum atmosphere to eliminate binary phase formation by reaction of the Cu2-xSe and InxSey selenides, leading to the formation of the CuInSe2 thin film. Electrochemical selenization from an aqueous electrolyte is introduced as an alternative to the toxic and hazardous H2Se or Se vapor phase selenization used in physical techniques. In this study, the influence of film deposition parameters such as bath composition, temperature and potential on film properties was studied. The electrochemical, morphological, structural and compositional properties of the electrodeposited thin films were characterized using various techniques. Results of Cyclic and Stripping-Cyclic Voltammetry (CV, SCV), Scanning Electron Microscopy (SEM) and Energy Dispersive X-Ray microanalysis (EDX) investigations revealed good reproducibility and homogeneity of the film composition. Thereby, optimal technological parameters for the electrochemical production of CuIn and Se as precursors for CuInSe2 thin layers are determined.

Keywords: photovoltaic, CIGS, copper alloys, electrodeposition, thin films

Procedia PDF Downloads 464
8236 Biofilm Text Classifiers Developed Using Natural Language Processing and Unsupervised Learning Approach

Authors: Kanika Gupta, Ashok Kumar

Abstract:

Biofilms are dense, highly hydrated cell clusters that are irreversibly attached to a substratum, to an interface or to each other, and are embedded in a self-produced gelatinous matrix composed of extracellular polymeric substances. Research in the biofilm field has become very significant, as biofilms have shown high mechanical resilience and resistance to antibiotic treatment and constitute a significant problem in both healthcare and other industries related to microorganisms. The massive information, both stated and hidden, in the biofilm literature is growing exponentially; therefore, it is not possible for researchers and practitioners to manually extract and relate information from different written resources. So, the current work proposes and discusses the use of text mining techniques for the extraction of information from a biofilm literature corpus containing 34,306 documents. It is very difficult and expensive to obtain annotated material for biomedical literature, as the literature is unstructured, i.e. free text. Therefore, we considered an unsupervised approach, where no annotated training material is necessary, and using this approach we developed a system that classifies the text on the basis of growth and development, drug effects, radiation effects, and classification and physiology of biofilms. For this, a two-step structure was used, where the first step is to extract keywords from the biofilm literature using a metathesaurus and standard natural language processing tools like Rapid Miner_v5.3, and the second step is to discover relations between the genes extracted from the whole set of biofilm literature using pubmed.mineR_v1.0.11. We used the unsupervised approach, which is the machine learning task of inferring a function to describe hidden structure from 'unlabeled' data, on the above-extracted datasets to develop classifiers using the WinPython-64 bit_v3.5.4.0Qt5 and R studio_v0.99.467 packages, which automatically classify the text by using the mentioned sets. The developed classifiers were tested on a large dataset of biofilm literature, which showed that the proposed unsupervised approach is promising as well as suited for a semi-automatic labeling of the extracted relations. The entire information was stored in a relational database hosted locally on the server. The generated biofilm vocabulary and gene relations will be significant for researchers dealing with biofilm research, making their search easy and efficient, as the keywords and genes can be directly mapped to the documents used for database development.
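
The study itself is built on RapidMiner, pubmed.mineR, WinPython and R; purely as a hedged, generic illustration of the unsupervised idea (grouping documents into topical clusters without annotated labels), the Python sketch below clusters a tiny toy corpus with TF-IDF features and k-means. The documents and cluster count are illustrative only and do not reflect the 34,306-document corpus.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Tiny illustrative corpus standing in for the biofilm literature.
docs = [
    "biofilm growth and development on medical implants",
    "antibiotic drug effects on the mature biofilm matrix",
    "radiation effects on biofilm viability",
    "classification and physiology of bacterial biofilms",
    "extracellular polymeric substances during biofilm development",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

# Cluster the documents into topical groups without any labels.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
for doc, label in zip(docs, km.labels_):
    print(label, "->", doc)
```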

Keywords: biofilms literature, classifiers development, text mining, unsupervised learning approach, unstructured data, relational database

Procedia PDF Downloads 174
8235 Customer Relationship Management: An Essential Tool for Librarians

Authors: Pushkar Lal Sharma, Sanjana Singh, Umesh Kumar Sahu

Abstract:

This paper helps to understand the need for Customer Relationship Management in libraries and why librarians should implement the marketing concept of Customer Relationship Management in their libraries. Like any industry, libraries too face growing challenges to continuously meet customer expectations and to attract and retain users in light of overflowing competition. The ability to understand customers, build relationships and market diverse services is essential when considering ways to expand service offerings and improve return on investment. Since a library is a service-oriented enterprise, the customer/user/reader/patron is the most important element of the Library & Information System, to whom and for whom the library offers various services. How to provide better and more efficient services to its users is the main concern of every Library & Information Centre in the present era. The basic difference between a business enterprise and a Library & Information System is that in a business system, efficiency is measured in terms of 'profit' or 'monetary gains', whereas in a Library & Information System, efficiency is measured in terms of 'services'; therefore, the goals set in a business enterprise are profit-oriented, whereas the goals set in a Library & Information Centre are service-oriented. With the explosion of information and the advancement of technology, readers have many choices for getting information other than visiting a library. Everything is available at the click of a mouse; library customers have become more knowledgeable and demanding in an era marked by an abundance of information resources and services. With this explosion of information in every field of knowledge and choice in the selection of services, satisfying the user has become a challenge nowadays for libraries. Accordingly, libraries have to build good relationships with their users by adopting Customer Relationship Management. CRM refers to the methods and tools which help an organization to manage its relationships with its customers in an organized way. Customer Relationship Management (CRM) combines business strategy and technology to identify, acquire and retain good customer relationships. The goal of CRM is to optimize the management of customer information needs and interests and to increase customer satisfaction and loyalty. Implementing CRM in libraries can improve customer data and process management, customer loyalty, retention and satisfaction.

Keywords: customer relationship management, CRM, CRM tools, customer satisfaction

Procedia PDF Downloads 72
8234 Biomedicine, Suffering, and Sacrifice: Myths and Prototypes in Cell and Gene Therapies

Authors: Edison Bicudo

Abstract:

Cell and gene therapies (CGTs) result from the intense manipulation of cells or the use of techniques such as gene editing. They have been increasingly used to tackle rare diseases or conditions of genetic origin, such as cancer. One might expect such a complex scientific field to be dominated by scientific findings and evidence-based explanations. However, people engaged in scientific argumentation also mobilize a range of cognitive operations of which they are not fully aware, in addition to drawing on widely available oral traditions. This paper analyses how experts discussing the potentialities and challenges of CGTs have recourse to a particular kind of prototypical myth. This sociology study, conducted at the University of Sussex (UK), involved interviews with scientists, regulators, and entrepreneurs involved in the development or governance of CGTs. It was observed that these professionals, when voicing their views, sometimes have recourse to narratives where CGTs appear as promising tools for alleviating or curing diseases. This is said to involve much personal, scientific, and financial sacrifice. In his study of traditional narratives, Hogan identified three prototypes: the romantic narrative, moved by the ideal of romantic union; the heroic narrative, moved by the desire for political power; and the sacrificial narrative, where the ideal is plenty, well-being, and health. It is argued here that discourses around CGTs often involve some narratives – or myths – that have a sacrificial nature. In this sense, the development of innovative therapies is depicted as a huge sacrificial endeavor involving biomedical scientists, biotech and pharma companies, and decision-makers. These sacrificial accounts draw on oral traditions and benefit from an emotional intensification that can be easily achieved in stories of serious diseases and physical suffering. Furthermore, these accounts draw on metaphorical understandings where diseases and vectors of diseases are considered enemies or invaders while therapies are framed as shields or protections. In this way, this paper aims to unravel the cognitive underpinnings of contemporary science – and, more specifically, biomedicine – revealing how myths, prototypes, and metaphors are highly operative even when complex reasoning is at stake. At the same time, this paper demonstrates how such hidden cognitive operations underpin the construction of powerful ideological discourses aimed at defending certain ways of developing, disseminating, and governing technologies and therapies.

Keywords: cell and gene therapies, myths, prototypes, metaphors

Procedia PDF Downloads 22
8233 Optimization of Hot Metal Charging Circuit in a Steel Melting Shop Using Industrial Engineering Techniques for Achieving Manufacturing Excellence

Authors: N. Singh, A. Khullar, R. Shrivastava, I. Singh, A. S. Kumar

Abstract:

Steel forms the basis of any modern society and is essential to economic growth. India’s annual crude steel production has seen a consistent increase over the past years and is poised to grow to 300 million tons per annum by 2030-31 from the current level of 110-120 million tons per annum. The steel industry is highly capital-intensive, and to remain competitive, it is imperative that it invests in operational excellence. Due to the inherent nature of the industry, there is a large amount of variability in its supply chain, both internally and externally. The production and productivity of a steel plant are greatly affected by the bottlenecks present in material flow logistics. The internal logistics, consisting of the transport of liquid metal within a steel melting shop (SMS), presents an opportunity to increase throughput with marginal capital investment. The study was carried out at one of the three SMSs of an integrated steel plant located in the eastern part of India. The objective of this study was to identify means to optimize SMS hot metal logistics through the application of industrial engineering techniques. The study also covered the identification of non-value-added activities and proposed methods to eliminate delays and improve the throughput of the SMS.

Keywords: optimization, steel making, supply chain, throughput enhancement, workforce productivity

Procedia PDF Downloads 121
8232 Optimizing the Use of Google Translate in Translation Teaching: A Case Study at Prince Sultan University

Authors: Saadia Elamin

Abstract:

The quasi-universal use of smartphones with an internet connection available all the time makes it a reflex action for translation undergraduates, once they encounter the least translation problem, to turn to the freely available web resource: Google Translate. As with other translator resources and aids, the use of Google Translate needs to be moderated in such a way that it contributes to developing translation competence. Here, instead of interfering with students’ learning by providing ready-made solutions which might not always fit the contexts of use, it can help to consolidate the skills of analysis and transfer which students have already acquired. One way to do so is by training students to adhere to the basic principles of translation work. The most important of these is that analyzing the source text for comprehension comes first and foremost, before jumping into the search for target language equivalents. Another basic principle is that certain translator aids and tools can be used for comprehension, while others are to be confined to the phase of re-expressing the meaning in the target language. The present paper reports on the experience of making a measured and reasonable use of Google Translate in translation teaching at Prince Sultan University (PSU), Riyadh. First, it traces the development that has taken place in the field of translation in this age of information technology, be it in translation teaching and translator training, or in the real-world practice of the profession. Second, it describes how, with the aim of reflecting this development in the way translation is taught, senior students, after being trained on post-editing machine translation output, are authorized to use Google Translate in classwork and assignments. Third, the paper elaborates on the findings of this case study, which has demonstrated that Google Translate, if used at the appropriate levels of training, can help to enhance students’ ability to perform different translation tasks. This help extends from the search for terms and expressions to the tasks of drafting the target text, revising its content and finally editing it. In addition, using Google Translate in this way fosters a reflexive and critical attitude towards web resources in general, thus maximizing the benefit gained from them in preparing students to meet the requirements of the modern translation job market.

Keywords: Google Translate, post-editing machine translation output, principles of translation work, translation competence, translation teaching, translator aids and tools

Procedia PDF Downloads 477
8231 Formalizing the Sense Relation of Hyponymy from a Logical Point of View: A Study of Mathematical Linguistics in Farsi

Authors: Maryam Ramezankhani

Abstract:

The present research studies the possibility of formalizing the sense relation of hyponymy. It applies mathematical tools and also uses concepts from mathematical logic, especially propositional logic. In order to do so, it first goes over the definitions of hyponymy presented in linguistic dictionaries and semantic textbooks. Then, it introduces a formal translation of the sense relation of hyponymy. Lastly, it examines the efficiency of the suggested formula on some examples from natural language.
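
The abstract does not reproduce the suggested formula itself; purely as an illustration of the kind of logical translation meant, a standard textbook rendering of hyponymy as meaning inclusion (stated here in predicate-logic form, not necessarily the author's formulation) is:

```latex
% "X is a hyponym of Y" read as meaning inclusion:
% everything that falls under X also falls under Y.
\forall x \, \bigl( X(x) \rightarrow Y(x) \bigr)
\qquad \text{e.g.} \qquad
\forall x \, \bigl( \mathit{rose}(x) \rightarrow \mathit{flower}(x) \bigr)
```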

Keywords: sense relations, hyponymy, formalizing, words’ sense relation, formalizing sense relations

Procedia PDF Downloads 242
8230 Coding and Decoding versus Space Diversity for Rayleigh Fading Radio Frequency Channels

Authors: Ahmed Mahmoud Ahmed Abouelmagd

Abstract:

Diversity is the usual remedy for transmitted signal level variations (fading phenomena) in radio frequency channels. Diversity techniques utilize two or more copies of a signal and combine those signals to combat fading. The basic concept of diversity is to transmit the signal via several independent diversity branches to get independent signal replicas, via the time, frequency, space and polarization diversity domains. Coding and decoding processes can be an alternative remedy for fading phenomena; they cannot increase the channel capacity, but they can improve the error performance. In this paper we propose the use of replication decoding with the BCH code class, and the Viterbi decoding algorithm with convolutional coding, as examples of coding and decoding processes. The results are compared to those obtained from two optimized selection space diversity techniques. The performance of the Rayleigh fading channel, as the model considered for radio frequency channels, is evaluated for each case. The evaluation results show that the coding and decoding approaches, especially the BCH coding approach with the replication decoding scheme, give better performance compared to that of the selection space diversity optimization approaches. Also, an approach combining the coding and decoding diversity as well as the space diversity is considered; the main disadvantage of this approach is its complexity, but it yields good performance results.
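
As a rough, hedged illustration of how such performance comparisons are typically made (this is not the paper's BCH/replication or convolutional/Viterbi setup), the Python sketch below estimates the bit error rate of coherent BPSK over flat Rayleigh fading with L-branch selection combining; the SNR point, branch counts and bit count are illustrative.

```python
import numpy as np

def ber_selection_diversity(snr_db: float, branches: int, n_bits: int = 200_000) -> float:
    """BER of coherent BPSK over flat Rayleigh fading with selection combining."""
    rng = np.random.default_rng(1)
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n_bits)
    symbols = 1 - 2 * bits  # BPSK mapping: 0 -> +1, 1 -> -1

    # Independent unit-power Rayleigh branches; select the branch with the largest gain.
    h = (rng.normal(size=(branches, n_bits)) + 1j * rng.normal(size=(branches, n_bits))) / np.sqrt(2)
    best = np.argmax(np.abs(h), axis=0)
    h_sel = h[best, np.arange(n_bits)]

    # Complex AWGN scaled so the average per-symbol SNR equals snr.
    noise = (rng.normal(size=n_bits) + 1j * rng.normal(size=n_bits)) / np.sqrt(2 * snr)
    r = h_sel * symbols + noise
    detected = (np.real(np.conj(h_sel) * r) < 0).astype(int)  # coherent detection
    return float(np.mean(detected != bits))

for L in (1, 2, 4):
    print(f"L={L}: BER at 10 dB ~ {ber_selection_diversity(10, L):.4f}")
```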

Keywords: Rayleigh fading, diversity, BCH codes, replication decoding, convolutional coding, Viterbi decoding, space diversity

Procedia PDF Downloads 445
8229 Greatly Improved Dielectric Properties of Poly(vinylidene fluoride) Nanocomposites Using Ag-BaTiO₃ Hybrid Nanoparticles as Filler

Authors: K. Silakaew, P. Thongbai

Abstract:

There is an increasing need for high–permittivity polymer–matrix composites (PMC) owing to the rapid development of the electronics industry. Unfortunately, the dielectric permittivity of PMC is still too low (ε′ < 80). Moreover, the dielectric loss tangent is usually high (tan δ > 0.1) when the dielectric permittivity of PMC is increased. In this research work, the dielectric properties of poly(vinylidene fluoride) (PVDF)–based nanocomposites can be significantly improved by incorporating silver–BaTiO3 (Ag–BT) ceramic hybrid nanoparticles. The Ag–BT/PVDF nanocomposites were fabricated using various volume fractions of Ag–BT hybrid nanoparticles (fAg–BT = 0–0.5). The Ag–BT/PVDF nanocomposites were characterized using several techniques. The main phases of Ag and BT can be detected by the XRD technique. The microstructure of the Ag–BT/PVDF nanocomposites was investigated to reveal the dispersion of the Ag–BT hybrid nanoparticles, because the dispersion state of a filler can have an effect on the dielectric properties of the nanocomposites. It was found that the filler hybrid nanoparticles were well dispersed in the PVDF matrix. The formation of the PVDF phases was identified using the XRD and FTIR techniques. We found that the fillers can increase the polar phase of the PVDF polymer. The fabricated Ag–BT/PVDF nanocomposites are systematically characterized to explain the dielectric behavior of Ag–BT/PVDF nanocomposites. Interestingly, a largely enhanced dielectric permittivity (ε′ > 240) and a suppressed loss tangent (tan δ < 0.08) over a wide frequency range (10²–10⁵ Hz) are obtained. Notably, the dielectric permittivity is only slightly dependent on temperature. The greatly enhanced dielectric permittivity is explained by the interfacial polarization at the Ag–PVDF interface and by the high permittivity of the BT particles.

Keywords: BaTiO3, PVDF, polymer composite, dielectric properties

Procedia PDF Downloads 198
8228 Relationship Between Insulin Resistance and Some Coagulation and Fibrinolytic Parameters in Subjects With Metabolic Syndrome

Authors: Amany Ragab, Nashwa Khairat Abousamra, Omayma Saleh, Asmaa Higazy

Abstract:

Insulin resistance syndrome has been shown to be associated with many coagulation and fibrinolytic proteins, and these associations suggest that some coagulation and fibrinolytic proteins play a role in atherothrombotic disorders. This study was conducted to determine the levels of some of the haemostatic parameters in subjects with metabolic syndrome and to correlate these values with the anthropometric and metabolic variables associated with this syndrome. The study included 46 obese non-diabetic subjects, of whom 28 (group 1) fulfilled the ATP III criteria of the metabolic syndrome and 18 (group 2) did not have metabolic syndrome, as well as 14 lean subjects (group 3) of matched age and sex as a control group. Clinical and laboratory evaluation of the study groups stressed anthropometric measurements (weight, height, body mass index, waist circumference, and sagittal abdominal diameter), blood pressure, and laboratory measurements of fasting plasma glucose, fasting insulin, serum lipids, tissue plasminogen activator (t-PA), antithrombin III activity (AT III), protein C and von Willebrand factor (vWf) antigen. There was a significant increase in the concentrations of t-PA and vWf antigens in subjects with metabolic syndrome (group 1) in comparison to the other groups, while there were non-significant changes in the levels of protein C antigen and AT III activity. Both t-PA and vWf showed significant correlations with HOMA-IR as a measure of insulin sensitivity. t-PA also showed significant correlations with most of the variables of metabolic syndrome, including waist circumference, BMI, systolic blood pressure, fasting plasma glucose, fasting insulin, and HDL cholesterol. On the other hand, vWf showed significant correlations with fasting plasma glucose, fasting insulin and sagittal abdominal diameter, with non-significant correlations with the other variables. Haemostatic and fibrinolytic parameters should be included in the features and characterization of the insulin resistance syndrome. t-PA and vWf antigen concentrations were increased in subjects with metabolic syndrome and correlated with the HOMA-IR measure of insulin sensitivity. Taking into consideration that both t-PA and vWf are mainly released from the vascular endothelium, these findings could be an indicator of endothelial dysfunction in that group of subjects.
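
For reference, the HOMA-IR index used here as the measure of insulin sensitivity is conventionally computed from fasting values as follows (this is the standard definition, not a result from the study):

```latex
\text{HOMA-IR}
  \;=\;
  \frac{\text{fasting insulin}\ (\mu\mathrm{U/mL}) \times \text{fasting glucose}\ (\mathrm{mmol/L})}{22.5}
```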

Keywords: insulin resistance, obesity, metabolic syndrome, coagulation

Procedia PDF Downloads 139
8227 A Furniture Industry Concept for a Sustainable Generative Design Platform Employing Robot Based Additive Manufacturing

Authors: Andrew Fox, Tao Zhang, Yuanhong Zhao, Qingping Yang

Abstract:

The furniture manufacturing industry has generally been slow to adopt the latest manufacturing technologies, historically relying heavily upon specialised conventional machinery. This approach not only requires high levels of specialist process knowledge, training, and capital investment but also suffers from significant subtractive manufacturing waste and high logistics costs due to the requirement for centralised manufacturing, with a high proportion of furniture products not recycled or re-used. This paper aims to address these problems by introducing suitable digital manufacturing technologies to create step changes in furniture manufacturing design, as traditional design practices have been reported to build in 80% of environmental impact. In this paper, a 3D printing robot for furniture manufacturing is reported. The 3D printing robot mainly comprises a KUKA industrial robot, an Arduino microprocessor, and a self-assembled screw-fed extruder. Compared to a traditional 3D printer, the 3D printing robot has a larger motion range and can easily be upgraded to enlarge the maximum size of the printed object. Generative design is also investigated in this paper, aiming to establish a combined design methodology that allows simultaneous assessment of goals, constraints, materials, and manufacturing processes. ‘Matrixing’ for part amalgamation and product performance optimisation is enabled. The generative design goals of integrated waste reduction, increased manufacturing efficiency, optimised product performance, and reduced environmental impact institute a truly lean and innovative future design methodology. In addition, there is massive future potential to leverage Single Minute Exchange of Die (SMED) theory through generative design post-processing of geometry for robot manufacture, resulting in ‘mass customised’ furniture with virtually no setup requirements. These generatively designed products can be manufactured using robot-based additive manufacturing. Essentially, the 3D printing robot is already functional; some initial goals have been achieved and are also presented in this paper.

Keywords: additive manufacturing, generative design, robot, sustainability

Procedia PDF Downloads 134
8226 O-Functionalized CNT Mediated CO Hydro-Deoxygenation and Chain Growth

Authors: K. Mondal, S. Talapatra, M. Terrones, S. Pokhrel, C. Frizzel, B. Sumpter, V. Meunier, A. L. Elias

Abstract:

Worldwide energy independence is reliant on the ability to leverage locally available resources for fuel production. Recently, syngas produced through gasification of carbonaceous materials provided a gateway to a host of processes for the production of various chemicals, including transportation fuels. The basis of the production of gasoline and diesel-like fuels is the Fischer Tropsch Synthesis (FTS) process: a catalyzed chemical reaction that converts a mixture of carbon monoxide (CO) and hydrogen (H2) into long chain hydrocarbons. Until now, it has been argued that only transition metal catalysts (usually Co or Fe) are active toward CO hydrogenation and subsequent chain growth in the presence of hydrogen. In this paper, we demonstrate that carbon nanotube (CNT) surfaces are also capable of hydro-deoxygenating CO and producing long chain hydrocarbons similar to those obtained through FTS, but with orders of magnitude higher conversion efficiencies than the present state-of-the-art FTS catalysts. We have used advanced experimental tools such as XPS and microscopy techniques to characterize CNTs and identify C-O functional groups as the active sites for the enhanced catalytic activity. Furthermore, we have conducted quantum Density Functional Theory (DFT) calculations to confirm that C-O groups (inherent on CNT surfaces) could indeed be catalytically active towards reduction of CO with H2, and capable of sustaining chain growth. The DFT calculations have shown that the kinetically and thermodynamically feasible routes for CO insertion and hydro-deoxygenation are different from those on transition metal catalysts. Experiments on a continuous flow tubular reactor with various nearly metal-free CNTs have been carried out and the products have been analyzed. CNTs functionalized by various methods were evaluated under different conditions. Reactor tests revealed that hydrogen pre-treatment reduced the activity of the catalysts to negligible levels. Without the pretreatment, the activity for CO conversion was found to be 7 µmol CO/g CNT/s. The O-functionalized samples showed much higher activities, greater than 85 µmol CO/g CNT/s, with nearly 100% conversion. Analyses show that CO hydro-deoxygenation occurred at the C-O/O-H functional groups. It was found that while the products were similar to FT products, differences in selectivities were observed, which, in turn, were a result of a different catalytic mechanism. These findings open a new paradigm for CNT-based hydrogenation catalysts and constitute a defining point for obtaining clean, earth-abundant, alternative fuels through the use of efficient and renewable catalysts.

Keywords: CNT, CO Hydrodeoxygenation, DFT, liquid fuels, XPS, XTL

Procedia PDF Downloads 349
8225 Teaching Buddhist Meditation: An Investigation into Self-Learning Methods

Authors: Petcharat Lovichakorntikul, John Walsh

Abstract:

Meditation is in the process of becoming a globalized practice and its benefits have been widely acknowledged. The first wave of internationalized meditation techniques and practices was represented by Chan and Zen Buddhism, and a new wave of practice has arisen in Thailand as part of the Phra Dhammakaya temple movement. This form of meditation is intended to be simple and straightforward so that it can easily be taught to people unfamiliar with the basic procedures and philosophy. This has made Phra Dhammakaya an important means of outreach to the international community. One notable aspect is the encouragement of adults to become like children in order to perform it – that is, to return to a naïve state prior to the adoption of ideology as a means of understanding the world. It is said that the Lord Buddha achieved the point of awakening at the age of seven, and Phra Dhammakaya has a program to teach meditation to both children and adults. This raises the research questions of how practitioners respond to the practice of meditation and how they should be taught. If a careful understanding of how children behave can be achieved, then it will help in teaching adults how to become like children (albeit idealized children) in their approach to meditation. This paper reports on action research in this regard. Personal interviews and focus groups are held with a view to understanding self-learning methods with respect to Buddhist meditation and understanding and appreciation of the practices involved. The findings are considered in the context of existing knowledge about different learning techniques among people of different ages. The implications for pedagogical practice are discussed and learning methods are outlined.

Keywords: Buddhist meditation, Dhammakaya, meditation technique, pedagogy, self-learning

Procedia PDF Downloads 481
8224 Numerical Study of Jet Impingement Heat Transfer

Authors: A. M. Tiara, Sudipto Chakraborty, S. K. Pal

Abstract:

Impinging jets and their different configurations are important from the viewpoint of fluid flow characteristics and their influence on heat transfer from metal surfaces, owing to their complex flow behaviour. Such flow characteristics result in highly variable heat transfer from the surface, producing varying cooling rates which affect the mechanical properties, including hardness and strength. The overall objective of the current research is to conduct a fundamental investigation of the heat transfer mechanisms for an impinging coolant jet. Numerical simulation of the cooling process gives a detailed analysis of the different parameters involved, even though employing Computational Fluid Dynamics (CFD) to simulate the real-time process, being a relatively new research area, poses many challenges. The heat transfer mechanism in the current research is actuated by jet cooling. The computational tool used in the ongoing research for simulation of the cooling process is the ANSYS Workbench software. The temperature and heat flux distribution along the steel strip, together with the effect of various flow parameters on the heat transfer rate, can be observed, in addition to the determination of the jet impingement patterns, which is the major aim of the present analysis. Modelling both jet and air-atomized cooling techniques using CFD methodology, validating against experimental results, including trial and error with different models, and comparing cooling rates from both techniques are included in this work. Finally, some concluding remarks are made that identify gaps in the available literature which have influenced the path of the current investigation.
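
For reference, the surface heat transfer in jet impingement studies is usually reported through the local heat transfer coefficient and the Nusselt number; the standard definitions (not results from this work) are:

```latex
h = \frac{q''}{T_s - T_j},
\qquad
\mathrm{Nu} = \frac{h\,d}{k_f},
```

where q'' is the surface heat flux, T_s the strip surface temperature, T_j the jet (coolant) temperature, d the nozzle diameter, and k_f the thermal conductivity of the coolant.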

Keywords: CFD, heat transfer, impinging jets, numerical simulation

Procedia PDF Downloads 238
8223 Modelling Soil Inherent Wind Erodibility Using Artificial Intelligence and Hybrid Techniques

Authors: Abbas Ahmadi, Bijan Raie, Mohammad Reza Neyshabouri, Mohammad Ali Ghorbani, Farrokh Asadzadeh

Abstract:

In recent years, vast areas of Urmia Lake in Dasht-e-Tabriz have dried up, leaving saline sediments exposed on the surface, with the lake coastal areas being highly susceptible to wind erosion. This study was conducted to investigate wind erosion and its relation to soil physicochemical properties, and also to model wind erodibility (WE) using artificial intelligence techniques. For this purpose, 96 soil samples were collected from 0-5 cm depth over 414,000 hectares using a stratified random sampling method. To measure WE, all samples (<8 mm) were exposed to 5 different wind velocities (9.5, 11, 12.5, 14.1 and 15 m s-1 at a height of 20 cm) in a wind tunnel, and its relationship with soil physicochemical properties was evaluated. According to the results, WE varied within the range of 9.98-76.69 (g m-2 min-1)/(m s-1) with a mean of 10.21 and a coefficient of variation of 94.5%, showing a relatively high variation in the studied area. WE was significantly (P<0.01) affected by soil physical properties, including mean weight diameter, erodible fraction (secondary particles smaller than 0.85 mm) and the percentage of the secondary particle size classes 2-4.75, 1.7-2 and 0.1-0.25 mm. Results showed that the mean weight diameter, erodible fraction and percentage of size class 0.1-0.25 mm demonstrated stronger relationships with WE (coefficients of determination were 0.69, 0.67 and 0.68, respectively). This study also compared the efficiency of multiple linear regression (MLR), gene expression programming (GEP), an artificial neural network (MLP), an artificial neural network based on a genetic algorithm (MLP-GA) and an artificial neural network based on the whale optimization algorithm (MLP-WOA) in predicting soil wind erodibility in Dasht-e-Tabriz. Among the 32 measured soil variables, the percentages of fine sand, the size classes of 1.7-2.0 and 0.1-0.25 mm (secondary particles) and organic carbon were selected as the model inputs by step-wise regression. Findings showed MLP-WOA to be the most powerful artificial intelligence technique (R2=0.87, NSE=0.87, ME=0.11 and RMSE=2.9) for predicting soil wind erodibility in the study area, followed by MLP-GA, MLP, GEP and MLR; the differences between these methods were significant according to the MGN test. Based on the above findings, MLP-WOA may be used as a promising method to predict soil wind erodibility in the study area.
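
The best-performing model couples an MLP with the whale optimization algorithm; purely as a simplified baseline sketch (a plain MLP with a standard optimizer, synthetic data, and hypothetical columns standing in for the step-wise-selected inputs), the Python example below also shows how the Nash-Sutcliffe efficiency (NSE) reported above can be computed.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Synthetic stand-in for the 96 samples; columns mimic the selected inputs:
# fine sand %, 1.7-2.0 mm fraction %, 0.1-0.25 mm fraction %, organic carbon %.
rng = np.random.default_rng(42)
X = rng.uniform([0, 0, 0, 0.1], [40, 30, 50, 2.0], size=(96, 4))
y = 20 + 0.5 * X[:, 2] - 0.3 * X[:, 1] - 5.0 * X[:, 3] + rng.normal(0, 2, 96)  # toy WE values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
)
model.fit(X_tr, y_tr)
print("NSE on held-out samples:", round(nash_sutcliffe(y_te, model.predict(X_te)), 2))
```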

Keywords: wind erosion, erodible fraction, gene expression programming, artificial neural network

Procedia PDF Downloads 73
8222 XAI Implemented Prognostic Framework: Condition Monitoring and Alert System Based on RUL and Sensory Data

Authors: Faruk Ozdemir, Roy Kalawsky, Peter Hubbard

Abstract:

Accurate estimation of RUL provides a basis for effective predictive maintenance, reducing unexpected downtime for industrial equipment. However, while models such as the Random Forest have effective predictive capabilities, they are so-called ‘black box’ models whose limited interpretability makes it difficult to base critical diagnostic decisions on them in industries such as aviation. The purpose of this work is to present a prognostic framework that embeds Explainable Artificial Intelligence (XAI) techniques in order to provide essential transparency in the decision-making mechanisms of Machine Learning methods based on sensor data, with the objective of procuring actionable insights for the aviation industry. Sensor readings have been gathered from critical equipment such as a turbofan jet engine and landing gear, and the prediction of the RUL is done by a Random Forest model. The workflow involves steps such as data gathering, feature engineering, model training, and evaluation. Models are trained and evaluated independently on each of these critical components’ datasets. While suitable predictions are served and their performance metrics are reasonably good, such complex models obscure the reasoning behind their predictions and may even undermine the confidence of the decision-maker or the maintenance teams. This is followed, in the second phase, by global explanations using SHAP and local explanations using LIME to bridge the gap in reliability within industrial contexts. These tools analyze model decisions, highlighting feature importance and explaining how each input variable affects the output. This dual approach offers a general comprehension of the overall model behavior and detailed insight into specific predictions. The proposed framework, in its third component, incorporates the techniques of causal analysis in the form of Granger causality tests in order to move beyond correlation toward causation. This will not only allow the model to predict failures but also present reasons, from the key sensor features linked to possible failure mechanisms, to relevant personnel. The causality between sensor behaviors and equipment failures creates much value for maintenance teams due to better root cause identification and effective preventive measures. This step contributes to the system being more explainable. In yet another stage, several simple surrogate models, including decision trees and linear models, can be used to approximately represent the complex Random Forest model. These simpler models act as backups, replicating important aspects of the original model's behavior. If the feature explanations obtained from the surrogate model are cross-validated with the primary model, the insights derived are more reliable and provide an intuitive sense of how the input variables affect the predictions. We then create an iterative explainable feedback loop, where the knowledge learned from the explainability methods feeds back into the training of the models. This feeds into a cycle of continuous improvement in both model accuracy and interpretability over time. By systematically integrating new findings, the model is expected to adapt to changed conditions and further develop its prognosis capability. These components are then presented to the decision-makers through the development of a fully transparent condition monitoring and alert system.
The system provides a holistic tool for maintenance operations by leveraging RUL predictions, feature importance scores, persistent sensor threshold values, and autonomous alert mechanisms. Since the system will provide explanations for the predictions given, along with active alerts, the maintenance personnel can make informed decisions on their end regarding correct interventions to extend the life of the critical machinery.
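
A minimal sketch of the RUL-prediction-plus-explanation step is given below; it assumes the shap package is installed, uses a Random Forest regressor as in the framework described, and replaces the real turbofan and landing-gear sensor channels with a synthetic stand-in.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for per-cycle sensor summaries; real inputs would come
# from turbofan or landing-gear sensor channels (e.g., C-MAPSS-style data).
rng = np.random.default_rng(7)
X = rng.normal(size=(500, 5))
rul = 120 - 25 * X[:, 0] + 10 * X[:, 3] + rng.normal(0, 5, 500)  # toy RUL target

X_tr, X_te, y_tr, y_te = train_test_split(X, rul, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("Test R^2:", round(model.score(X_te, y_te), 3))

# Global explanation: mean absolute SHAP value per feature for the tree ensemble.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
for i, v in enumerate(np.abs(shap_values).mean(axis=0)):
    print(f"sensor_{i}: mean |SHAP| = {v:.2f}")
```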

Keywords: predictive maintenance, explainable artificial intelligence, prognostic, RUL, machine learning, turbofan engines, C-MAPSS dataset

Procedia PDF Downloads 10
8221 Innovation Management: A Comparative Analysis among Organizations from United Arab Emirates, Saudi Arabia, Brazil and China

Authors: Asmaa Abazaid, Maram Al-Ostah, Nadeen Abu-Zahra, Ruba Bawab, Refaat Abdel-Razek

Abstract:

An innovation audit is defined as a tool that can be used to reflect on how innovation is managed in an organization. The aim of this study is to audit innovation in the second top engineering firm in the world and in one of the small and medium-sized enterprises (SMEs) working in the United Arab Emirates (UAE). The obtained results are then compared with those of international companies from Saudi Arabia, Brazil and China. The Diamond model has been used for auditing innovation in the two companies in the UAE to evaluate their innovation management and to identify each company’s strengths and weaknesses from an innovation perspective. The results of the comparison between the two companies (Jacobs and Hyper General Contracting) revealed that Jacobs has support for innovation, its innovation processes are well managed, the company is committed to the development of its employees worldwide, and its innovation system is flexible. Jacobs was doing best in all innovation management dimensions: strategy, process, organization, linkages and learning, while Hyper General Contracting did not score as well as Jacobs in any of the innovation dimensions. Furthermore, the audit results of both companies were compared with international companies to examine how well the two construction companies in the UAE manage innovation relative to SABIC (a Saudi company), Poly Easy and Arnious (Brazilian companies), and Huagong Tools and Guizohou Yibai (Chinese companies). The results revealed that Jacobs is doing best in the learning and organization dimensions, while Poly Easy and Jacobs are equal in the linkage dimension. Huagong Tools scored the highest in the process dimension among all the compared companies. However, the highest score in the strategy dimension was given to Poly Easy. On the other hand, Hyper General Contracting scored the lowest in all of the innovation management dimensions. It needs to improve its management of all the innovation management dimensions, with special attention given to strategy, process, and linkage, as they scored below 4 out of 7 compared with the other dimensions. Jacobs scored the highest in three innovation management dimensions relative to the six companies. However, its strategy dimension is considered low, and special attention is needed in this dimension.

Keywords: Brazil, China, innovation audit, innovation evaluation, innovation management, Saudi Arabia, United Arab Emirates

Procedia PDF Downloads 286
8220 Comparison of Yb and Tm-Fiber Laser Cutting Processes of Fiber Reinforced Plastics

Authors: Oktay Celenk, Ugur Karanfil, Iskender Demir, Samir Lamrini, Jorg Neumann, Arif Demir

Abstract:

Due to their favourable material characteristics, fiber reinforced plastics are amongst the main topics of all current lightweight construction megatrends. Especially in transportation, in trends ranging from aeronautics through the automotive industry to naval transportation (yachts, cruise liners), the expected economic and environmental impact is huge. In naval transportation, components like yacht bodies, antenna masts, and decorative structures like deck lamps, light houses and pool areas represent cheap and robust solutions. Commercially available laser tools like carbon dioxide gas lasers (CO₂), frequency-tripled solid state UV lasers, and Neodymium-YAG (Nd:YAG) lasers can be used. These tools have emission wavelengths of 10 µm, 0.355 µm, and 1.064 µm, respectively. The scientific goal is first of all the generation of a parameter matrix for laser processing of each material used with a Tm-fiber laser system (wavelength 2 µm). These parameters include the heat affected zone, process gas pressure, workpiece feed velocity, intensity, irradiation time, etc. The results are compared with those obtained with well-known material processing lasers, such as Yb-fiber lasers (wavelength 1 µm). Compared to the CO₂ laser, the Tm-laser offers essential advantages for future laser processes like cutting, welding, ablating for repair and drilling in composite part manufacturing (components of cruise liners, marine pipelines). Some of these are the possibility of beam delivery in a standard fused silica fiber, which enables hand-guided processing; eye safety, which results from the wavelength; and excellent beam quality and brilliance due to the fiber nature. One more feature that is economically important for boat, automotive and military manufacturing projects is that the wavelength of 2 µm is highly absorbed by the plastic matrix and thus enables its selective removal for repair procedures.

Keywords: Thulium (Tm) fiber laser, laser processing of fiber-reinforced plastics (FRP), composite, heat affected zone

Procedia PDF Downloads 194
8219 Artificial Intelligence Features in Canva

Authors: Amira Masood, Zainah Alshouri, Noor Bantan, Samira Kutbi

Abstract:

Artificial intelligence is continuously becoming more advanced and more widespread and is present in many aspects of our day-to-day lives as a means of assistance in numerous different fields. A growing number of people, companies, and corporations are utilizing Canva and its AI tools as a method of quick and easy media production. Hence, in order to test the integrity of this rapid growth of AI, this paper explores the usefulness of Canva's advanced design features as well as their accuracy by determining user satisfaction through a survey-based research approach and by investigating whether or not the AI is successful enough to eliminate the need for human alterations.

Keywords: artificial intelligence, canva, features, users, satisfaction

Procedia PDF Downloads 110
8218 Resonant Fluorescence in a Two-Level Atom and the Terahertz Gap

Authors: Nikolai N. Bogolubov, Andrey V. Soldatov

Abstract:

Terahertz radiation occupies a range of frequencies from about 100 GHz to approximately 10 THz, just between microwaves and infrared waves. This range of frequencies holds promise for many useful applications in experimental applied physics and technology. At the same time, reliable, simple techniques for the generation, amplification, and modulation of electromagnetic radiation in this range are far from being developed enough to meet the requirements of its practical usage, especially in comparison to the level of technological ability already achieved for other domains of the electromagnetic spectrum. This situation of relative underdevelopment of this potentially very important range of the electromagnetic spectrum is known under the name of the 'terahertz gap.' Among other things, technological progress in the terahertz area has been impeded by the lack of compact, low-energy-consumption, easily controlled and continuously radiating terahertz radiation sources. Therefore, the development of new techniques serving this purpose, as well as various devices based on them, is an obvious necessity. No doubt, it would be highly advantageous to employ the simplest of suitable physical systems as the major critical components in these techniques and devices. The purpose of the present research was to show, by means of conventional methods of non-equilibrium statistical mechanics and the theory of open quantum systems, that a thoroughly studied two-level quantum system, also known as a one-electron two-level 'atom', being driven by an external classical monochromatic high-frequency (e.g. laser) field, can radiate continuously at a much lower (e.g. terahertz) frequency in the fluorescent regime if the transition dipole moment operator of this 'atom' possesses permanent, non-equal diagonal matrix elements. This assumption contradicts the conventional assumption routinely made in quantum optics that only the non-diagonal matrix elements persist. The conventional assumption is pertinent to natural atoms and molecules and stems from the spatial inversion symmetry of their eigenstates. At the same time, such an assumption is no longer justified in regard to artificially manufactured quantum systems of reduced dimensionality, such as, for example, quantum dots, which are often nicknamed 'artificial atoms' due to the striking similarity of their optical properties to those of real atoms. Possible ways to experimentally observe and practically implement the predicted effect are discussed too.
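
The key assumption can be written out explicitly; as an illustration consistent with the abstract (not necessarily the authors' notation), the dipole moment operator of the two-level 'atom' in its energy basis is taken as

```latex
\hat{d} =
\begin{pmatrix}
d_{11} & d_{12} \\
d_{21} & d_{22}
\end{pmatrix},
\qquad d_{11} \neq d_{22},
```

whereas the conventional quantum-optics assumption is d_{11} = d_{22} = 0, leaving only the off-diagonal (transition) elements d_{12} = d_{21}^{*}.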

Keywords: terahertz gap, two-level atom, resonant fluorescence, quantum dot

Procedia PDF Downloads 273
8217 The Impact of Enhanced Recovery after Surgery (ERAS) Protocols on Anesthesia Management in High-Risk Surgical Patients

Authors: Rebar Mohammed Hussein

Abstract:

Enhanced Recovery After Surgery (ERAS) protocols have transformed perioperative care, aiming to reduce surgical stress, optimize pain management, and accelerate recovery. This study evaluates the impact of ERAS on anesthesia management in high-risk surgical patients, focusing on opioid-sparing techniques and multimodal analgesia. A retrospective analysis was conducted on patients undergoing major surgeries within an ERAS program, comparing outcomes with a historical cohort receiving standard care. Key metrics included postoperative pain scores, opioid consumption, length of hospital stay, and complication rates. Results indicated that the implementation of ERAS protocols significantly reduced postoperative opioid use by 40% and improved pain management outcomes, with 70% of patients reporting satisfactory pain control on postoperative day one. Additionally, patients in the ERAS group experienced a 30% reduction in length of stay and a 20% decrease in complication rates. These findings underscore the importance of integrating ERAS principles into anesthesia practice, particularly for high-risk patients, to enhance recovery, improve patient satisfaction, and reduce healthcare costs. Future directions include prospective studies to further refine anesthesia techniques within ERAS frameworks and explore their applicability across various surgical specialties.

Keywords: ERAS protocols, high-risk surgical patients, anesthesia management, recovery

Procedia PDF Downloads 33
8216 Investigation of the EEG Signal Parameters during Epileptic Seizure Phases in Consequence to the Application of External Healing Therapy on Subjects

Authors: Karan Sharma, Ajay Kumar

Abstract:

Epileptic seizure is a type of disease in which electrical charge in the brain flows abruptly, resulting in abnormal activity by the subject. One percent of the total world population gets epileptic seizure attacks. Due to the abrupt flow of charge, the EEG (electroencephalogram) waveforms change, and a lot of spikes and sharp waves appear in the displayed EEG signals. Detection of epileptic seizures by using conventional methods is time-consuming. Many methods have evolved that detect them automatically. The initial part of this paper provides a review of techniques used to detect epileptic seizures automatically. The automatic detection is based on feature extraction and classification patterns. For better accuracy, decomposition of the signal is required before feature extraction. A number of parameters are calculated by researchers using different techniques, e.g. approximate entropy, sample entropy, fuzzy approximate entropy, intrinsic mode functions, cross-correlation, etc., to discriminate between a normal signal and an epileptic seizure signal. The main objective of this review paper is to present the variations in the EEG signals at both stages, (i) interictal (recording between the epileptic seizure attacks) and (ii) ictal (recording during the epileptic seizure), using the most appropriate methods of analysis to provide better healthcare diagnosis. This research paper then investigates the effects of a noninvasive healing therapy on the subjects by studying the EEG signals using the latest signal processing techniques. The study has been conducted with Reiki as a healing technique, beneficial for restoring balance in cases of body-mind alterations associated with an epileptic seizure. Reiki is practiced around the world and is recommended for different health services as a treatment approach. Reiki is an energy medicine, specifically a biofield therapy developed in Japan in the early 20th century. It is a system involving the laying on of hands to stimulate the body’s natural energetic system. Earlier studies have shown an apparent connection between Reiki and the autonomic nervous system. The Reiki sessions are applied by an experienced therapist. EEG signals are measured at baseline, during the session and post intervention to bring about effective epileptic seizure control or its elimination altogether.
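
As a compact reference implementation of one of the discriminative features mentioned above, the Python sketch below computes sample entropy on a synthetic signal standing in for an EEG segment; the embedding dimension and tolerance follow common defaults (m = 2, r = 0.2 times the standard deviation) rather than values from the reviewed studies.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) = -ln(A / B), where B counts pairs of length-m templates within
    tolerance r (Chebyshev distance, self-matches excluded) and A does the same
    for length m + 1."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    n_templates = len(x) - m  # same number of templates compared at both lengths

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(n_templates)])
        count = 0
        for i in range(n_templates - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 1000))    # low-complexity signal
irregular = regular + 0.5 * rng.normal(size=1000)     # noisier, more irregular signal
print("SampEn (regular):  ", round(sample_entropy(regular), 3))
print("SampEn (irregular):", round(sample_entropy(irregular), 3))
```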

Keywords: EEG signal, Reiki, time consuming, epileptic seizure

Procedia PDF Downloads 407
8215 Conceptual Model for Logistics Information System

Authors: Ana María Rojas Chaparro, Cristian Camilo Sarmiento Chaves

Abstract:

Given the growing importance of logistics as a discipline for the efficient management of material and information flows, the adoption of tools that facilitate decision-making based on a global perspective of the system studied has become essential. The article shows how, starting from a concept-based model, it is possible to organize and represent reality in an appropriate way, showing accurate and timely information, features that make this kind of model an ideal component to support an information system, recognizing that such information is relevant for establishing the particularities that allow a better performance in the evaluated sector.

Keywords: system, information, conceptual model, logistics

Procedia PDF Downloads 501
8214 Kannada Handwritten Character Recognition by Edge Hinge and Edge Distribution Techniques Using Manhattan and Minimum Distance Classifiers

Authors: C. V. Aravinda, H. N. Prakash

Abstract:

In this paper, we try to convey the fusion and state of the art pertaining to South Indian language (SIL) character recognition systems. In the first step, the text is preprocessed and normalized to perform the text identification correctly. The second step involves extracting relevant and informative features. The third step implements the classification decision. The three stages involved are data acquisition and preprocessing, feature extraction, and classification. Here we concentrate on two techniques to obtain features: feature extraction and feature selection. The edge-hinge distribution is a feature that characterizes the changes in direction of a script stroke in handwritten text. The edge-hinge distribution is extracted by means of a window pane that is slid over an edge-detected binary handwriting image. Whenever the mid pixel of the window is on, the two edge fragments (i.e. connected sequences of pixels) emerging from this mid pixel are measured; their directions are determined and stored as pairs. A joint probability distribution is obtained from a large sample of such pairs. Despite continuous effort, handwriting identification remains a challenging issue, as different approaches use different varieties of features. Therefore, our study focuses on handwriting recognition based on feature selection to simplify the feature extraction task, optimize classification system complexity, reduce running time and improve the classification accuracy.
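
As a much-simplified sketch of the edge-hinge idea (pairs of outgoing edge directions around an 'on' pixel accumulated into a joint histogram), the Python example below uses only the immediate 8-neighbourhood instead of tracing longer edge fragments inside a sliding window pane, so it approximates the feature described rather than reproducing the authors' implementation.

```python
import itertools
import numpy as np

def edge_hinge_histogram(edge_img, n_dirs=8):
    """Joint histogram of quantized direction pairs of 'on' 8-neighbours around
    each 'on' pixel of a binary edge image (a leg length of one pixel)."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    angles = [np.arctan2(dy, dx) for dy, dx in offsets]
    hist = np.zeros((n_dirs, n_dirs))
    h, w = edge_img.shape

    for y, x in zip(*np.nonzero(edge_img)):
        on_dirs = []
        for (dy, dx), ang in zip(offsets, angles):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and edge_img[ny, nx]:
                on_dirs.append(int(((ang + np.pi) / (2 * np.pi)) * n_dirs) % n_dirs)
        for a, b in itertools.combinations(sorted(on_dirs), 2):
            hist[a, b] += 1  # one hinge: a pair of outgoing edge directions

    total = hist.sum()
    return hist / total if total else hist

# Toy example: a short diagonal stroke standing in for an edge-detected character.
img = np.zeros((10, 10), dtype=np.uint8)
for i in range(8):
    img[i + 1, i + 1] = 1
print(edge_hinge_histogram(img).round(3))
```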

Keywords: word segmentation and recognition, character recognition, optical character recognition, hand written character recognition, South Indian languages

Procedia PDF Downloads 500
8213 Comprehensive Assessment of Energy Efficiency within the Production Process

Authors: S. Kreitlein, N. Eder, J. Franke

Abstract:

The importance of energy efficiency within the production process is increasing steadily. Unfortunately, so far no tools for a comprehensive assessment of energy efficiency within the production process exist. Therefore, the Institute for Factory Automation and Production Systems of the Friedrich-Alexander-University Erlangen-Nuremberg has developed two methods with the goal of achieving transparency and a quantitative assessment of energy efficiency: the EEV (Energy Efficiency Value) and the EPE (Energetic Process Efficiency). This paper describes the basics and the state of the art as well as the developed approaches.

Keywords: energy efficiency, energy efficiency value, energetic process efficiency, production

Procedia PDF Downloads 736
8212 Effects of Fermentation Techniques on the Quality of Cocoa Beans

Authors: Monday O. Ale, Adebukola A. Akintade, Olasunbo O. Orungbemi

Abstract:

Fermentation, as an important operation in the processing of cocoa beans, is now affected by the recent climate change across the globe. The major requirement for effective fermentation is the ability of the material used to retain sufficient heat for the required microbial activities. Apart from the effects of climate on the rate of heat retention, the material used for fermentation plays an important role. Most farmers still restrict fermentation activities to the use of traditional methods. Improving cocoa fermentation in this era of climate change makes it necessary to work on other materials that can be suitable for cocoa fermentation. Therefore, the objective of this study was to determine the effects of fermentation techniques on the quality of cocoa beans. The materials used in this fermentation research were heap-leaves (traditional), stainless steel, plastic tin, plastic basket and a wooden box. The period of fermentation varied from zero to 10 days. Physical and chemical tests were carried out for the quality variables in the samples. The weight per bean varied from 1.0-1.2 g after drying across the samples, and the major color of the dry beans observed was brown, except for the samples from stainless steel. The moisture content varied from 5.5-7%. The mineral content and the heavy metals decreased with an increase in the fermentation period. A wooden box can conclusively be used as an alternative to heap-leaves, as there was no significant difference in the physical features of the samples fermented with the two methods. The use of a wooden box as an alternative for cocoa fermentation is therefore recommended for cocoa farmers.

Keywords: fermentation, effects, fermentation materials, period, quality

Procedia PDF Downloads 210